Feb 17 17:43:47 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 17:43:47 crc restorecon[4754]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 17:43:47 crc restorecon[4754]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc 
restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc 
restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 
crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 17:43:47 crc restorecon[4754]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:47 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 17:43:48 crc restorecon[4754]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 
crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc 
restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc 
restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 17:43:48 crc restorecon[4754]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 17:43:49 crc kubenswrapper[4892]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 17:43:49 crc kubenswrapper[4892]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 17:43:49 crc kubenswrapper[4892]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 17:43:49 crc kubenswrapper[4892]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 17:43:49 crc kubenswrapper[4892]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 17:43:49 crc kubenswrapper[4892]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.078170 4892 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094901 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094934 4892 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094944 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094954 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094963 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094972 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094981 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094989 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.094997 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095006 4892 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095016 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095025 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095034 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095042 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095050 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095058 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095066 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095074 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095082 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095090 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095098 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095106 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095114 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095121 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095131 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095139 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095147 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095154 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095161 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095170 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095178 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095189 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095199 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095207 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095215 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095224 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095233 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095240 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095248 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095257 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095265 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095273 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095282 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095290 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095300 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095307 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095316 4892 
feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095323 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095331 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095338 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095346 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095353 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095363 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095372 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095380 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095388 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095395 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095403 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095412 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095422 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095433 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095445 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095455 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095463 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095472 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095479 4892 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095487 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095495 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095502 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095509 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.095517 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096650 4892 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096673 4892 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096689 4892 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096699 4892 flags.go:64] FLAG: 
--application-metrics-count-limit="100" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096710 4892 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096719 4892 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096730 4892 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096741 4892 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096750 4892 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096759 4892 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096769 4892 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096781 4892 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096791 4892 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096799 4892 flags.go:64] FLAG: --cgroup-root="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096807 4892 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096850 4892 flags.go:64] FLAG: --client-ca-file="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096859 4892 flags.go:64] FLAG: --cloud-config="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096869 4892 flags.go:64] FLAG: --cloud-provider="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096877 4892 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096888 4892 flags.go:64] FLAG: --cluster-domain="" Feb 17 17:43:49 crc 
kubenswrapper[4892]: I0217 17:43:49.096897 4892 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096906 4892 flags.go:64] FLAG: --config-dir="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096915 4892 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096925 4892 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096936 4892 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096944 4892 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096953 4892 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096963 4892 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096971 4892 flags.go:64] FLAG: --contention-profiling="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096980 4892 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096989 4892 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.096998 4892 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097006 4892 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097021 4892 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097030 4892 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097039 4892 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097048 
4892 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097058 4892 flags.go:64] FLAG: --enable-server="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097067 4892 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097080 4892 flags.go:64] FLAG: --event-burst="100" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097090 4892 flags.go:64] FLAG: --event-qps="50" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097100 4892 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097144 4892 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097154 4892 flags.go:64] FLAG: --eviction-hard="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097184 4892 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097193 4892 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097203 4892 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097217 4892 flags.go:64] FLAG: --eviction-soft="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097228 4892 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097237 4892 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097247 4892 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097257 4892 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097266 4892 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 
17:43:49.097277 4892 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097287 4892 flags.go:64] FLAG: --feature-gates="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097298 4892 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097308 4892 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097317 4892 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097327 4892 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097337 4892 flags.go:64] FLAG: --healthz-port="10248" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097347 4892 flags.go:64] FLAG: --help="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097357 4892 flags.go:64] FLAG: --hostname-override="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097367 4892 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097376 4892 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097386 4892 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097395 4892 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097403 4892 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097412 4892 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097421 4892 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097431 4892 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097439 
4892 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097448 4892 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097457 4892 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097466 4892 flags.go:64] FLAG: --kube-reserved="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097476 4892 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097484 4892 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097493 4892 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097502 4892 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097511 4892 flags.go:64] FLAG: --lock-file="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097520 4892 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097529 4892 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097538 4892 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097550 4892 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097561 4892 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097571 4892 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097581 4892 flags.go:64] FLAG: --logging-format="text" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097590 4892 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 17:43:49 crc kubenswrapper[4892]: 
I0217 17:43:49.097600 4892 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097609 4892 flags.go:64] FLAG: --manifest-url="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097618 4892 flags.go:64] FLAG: --manifest-url-header="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097629 4892 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097638 4892 flags.go:64] FLAG: --max-open-files="1000000" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097648 4892 flags.go:64] FLAG: --max-pods="110" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097658 4892 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097666 4892 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097676 4892 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097685 4892 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097694 4892 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097704 4892 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097713 4892 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097733 4892 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097743 4892 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097753 4892 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097761 4892 
flags.go:64] FLAG: --pod-cidr="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097770 4892 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097784 4892 flags.go:64] FLAG: --pod-manifest-path="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097793 4892 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097802 4892 flags.go:64] FLAG: --pods-per-core="0" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097836 4892 flags.go:64] FLAG: --port="10250" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097846 4892 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097855 4892 flags.go:64] FLAG: --provider-id="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097864 4892 flags.go:64] FLAG: --qos-reserved="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097873 4892 flags.go:64] FLAG: --read-only-port="10255" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097881 4892 flags.go:64] FLAG: --register-node="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097890 4892 flags.go:64] FLAG: --register-schedulable="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097899 4892 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097913 4892 flags.go:64] FLAG: --registry-burst="10" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097922 4892 flags.go:64] FLAG: --registry-qps="5" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097931 4892 flags.go:64] FLAG: --reserved-cpus="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097941 4892 flags.go:64] FLAG: --reserved-memory="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 
17:43:49.097951 4892 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097961 4892 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097970 4892 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097979 4892 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097988 4892 flags.go:64] FLAG: --runonce="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.097997 4892 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098006 4892 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098016 4892 flags.go:64] FLAG: --seccomp-default="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098025 4892 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098035 4892 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098044 4892 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098054 4892 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098063 4892 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098072 4892 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098087 4892 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098095 4892 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098105 4892 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" 
Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098114 4892 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098123 4892 flags.go:64] FLAG: --system-cgroups="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098132 4892 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098145 4892 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098154 4892 flags.go:64] FLAG: --tls-cert-file="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098163 4892 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098173 4892 flags.go:64] FLAG: --tls-min-version="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098182 4892 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098191 4892 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098200 4892 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098209 4892 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098218 4892 flags.go:64] FLAG: --v="2" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098229 4892 flags.go:64] FLAG: --version="false" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098240 4892 flags.go:64] FLAG: --vmodule="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098250 4892 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.098260 4892 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098492 4892 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098502 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098513 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098523 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098535 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098546 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098554 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098563 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098571 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098579 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098588 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098596 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098603 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098615 4892 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:43:49 crc kubenswrapper[4892]: 
W0217 17:43:49.098625 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098634 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098643 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098651 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098659 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098666 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098674 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098682 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098690 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098697 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098705 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098715 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098725 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098733 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098741 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098750 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098759 4892 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098766 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098775 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098782 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098790 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098798 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098806 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098838 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098847 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098855 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098863 4892 feature_gate.go:330] unrecognized feature 
gate: ClusterMonitoringConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098871 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098878 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098885 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098893 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098903 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098911 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098919 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098926 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098934 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098941 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098949 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098957 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098965 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098972 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 
17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098980 4892 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098988 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.098996 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099004 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099012 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099020 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099028 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099036 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099044 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099051 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099059 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099066 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099074 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099081 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099088 4892 feature_gate.go:330] 
unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.099096 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.099108 4892 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.111480 4892 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.111719 4892 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111883 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111900 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111910 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111918 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111926 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111935 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111942 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 
17:43:49.111950 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111958 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111966 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111974 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111983 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111991 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.111999 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112006 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112015 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112023 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112031 4892 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112038 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112046 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112055 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112063 4892 feature_gate.go:330] unrecognized feature gate: 
BootcNodeManagement Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112073 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112084 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112092 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112100 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112110 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112121 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112129 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112138 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112147 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112158 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112168 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112177 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112189 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112198 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112207 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112215 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112222 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112231 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112238 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112246 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112254 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112262 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112269 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112277 4892 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112286 
4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112293 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112301 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112309 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112316 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112324 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112334 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112342 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112349 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112357 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112364 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112372 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112379 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112389 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112398 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112406 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112415 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112424 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112431 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112439 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112447 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112455 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112463 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112470 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112478 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.112491 4892 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112704 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112715 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112724 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112732 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112740 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112748 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112755 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112763 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112771 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112778 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112788 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112796 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112803 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112841 4892 feature_gate.go:330] unrecognized feature 
gate: ImageStreamImportMode Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112853 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112863 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112873 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112880 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112888 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112895 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112903 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112912 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112919 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112927 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112935 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112942 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112950 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112958 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112965 4892 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112972 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112982 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.112993 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113001 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113010 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113022 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113030 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113039 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113047 4892 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113055 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113065 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113074 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113083 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113092 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113101 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113110 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113118 4892 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113126 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113158 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113167 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113175 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113183 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113190 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113198 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113206 4892 feature_gate.go:330] unrecognized feature gate: Example Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113213 4892 feature_gate.go:330] unrecognized feature 
gate: AdminNetworkPolicy Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113221 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113230 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113239 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113246 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113254 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113261 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113269 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113276 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113284 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113294 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113302 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113310 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113318 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113326 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113334 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.113342 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.113355 4892 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.113622 4892 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.120129 4892 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.120268 4892 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.122841 4892 server.go:997] "Starting client certificate rotation" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.122888 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.123158 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 02:10:22.657360397 +0000 UTC Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.123273 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.152643 4892 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.155168 4892 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.155537 4892 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.169868 4892 log.go:25] "Validated CRI v1 runtime API" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.208387 4892 log.go:25] "Validated CRI v1 image API" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.210690 4892 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.217075 4892 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-17-38-55-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.217128 4892 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.244182 4892 manager.go:217] Machine: {Timestamp:2026-02-17 17:43:49.241693713 +0000 UTC m=+0.617097048 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:06c95f3d-0382-44a5-9f64-04e3b3bbd534 BootID:5b854407-f7fd-494b-9ad1-f90f175f6ff2 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 
Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9d:6b:58 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9d:6b:58 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ec:58:fb Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6a:82:e6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8b:a4:bf Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e2:eb:34 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:50:7c:ff Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ae:ff:80:be:f1:f7 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:b0:93:44:50:ce Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.244591 
4892 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.244765 4892 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.249008 4892 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.249350 4892 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.249452 4892 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"no
defs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.250029 4892 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.250048 4892 container_manager_linux.go:303] "Creating device plugin manager" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.251022 4892 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.251078 4892 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.251446 4892 state_mem.go:36] "Initialized new in-memory state store" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.251593 4892 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.255311 4892 kubelet.go:418] "Attempting to sync node with API server" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.255348 4892 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.255389 4892 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.255411 4892 kubelet.go:324] "Adding apiserver pod source" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.255437 4892 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.261120 4892 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.262724 4892 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.263629 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.263652 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.263761 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.263792 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.264312 
4892 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.266434 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.266617 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.266728 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.266858 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267004 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267119 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267219 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267343 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267461 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267592 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267719 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.267853 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.268890 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 17 17:43:49 crc kubenswrapper[4892]: 
I0217 17:43:49.269744 4892 server.go:1280] "Started kubelet" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.270109 4892 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.270727 4892 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.271573 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.271967 4892 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 17 17:43:49 crc systemd[1]: Started Kubernetes Kubelet. Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.274850 4892 server.go:460] "Adding debug handlers to kubelet server" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.284408 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.284490 4892 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.284566 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 11:05:21.281221915 +0000 UTC Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.285775 4892 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.285887 4892 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.286033 4892 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 17 17:43:49 crc 
kubenswrapper[4892]: E0217 17:43:49.285849 4892 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.287145 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="200ms" Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.287396 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.287530 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.289480 4892 factory.go:55] Registering systemd factory Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.290888 4892 factory.go:221] Registration of the systemd container factory successfully Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.291530 4892 factory.go:153] Registering CRI-O factory Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.291559 4892 factory.go:221] Registration of the crio container factory successfully Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.291782 4892 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.292740 4892 factory.go:103] Registering Raw factory Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.292842 4892 manager.go:1196] Started watching for new ooms in manager Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.294298 4892 manager.go:319] Starting recovery of all containers Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.293408 4892 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.41:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189519a6046d5f57 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:43:49.269700439 +0000 UTC m=+0.645103734,LastTimestamp:2026-02-17 17:43:49.269700439 +0000 UTC m=+0.645103734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308112 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308270 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308301 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308326 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308355 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308383 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308410 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308434 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308478 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308505 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308529 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308610 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308637 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308670 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308696 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" 
seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308719 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308747 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308772 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308798 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308857 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308888 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 
17:43:49.308916 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308940 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308963 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.308987 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309013 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309043 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309072 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309097 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309123 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309149 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309175 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309216 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309245 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309275 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309300 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309328 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309351 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309376 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309406 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309433 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309459 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309535 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309617 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309644 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309676 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" 
seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309704 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.309731 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310551 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310580 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310607 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310636 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310672 4892 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310700 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310725 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310799 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310962 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.310998 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311036 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311064 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311091 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311118 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311145 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311177 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311204 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311237 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311271 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311296 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311331 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311366 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311398 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311433 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311464 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311501 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311580 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311608 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311633 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311658 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311687 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311710 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311747 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311780 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311881 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311909 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311946 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311971 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.311993 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312098 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312125 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312151 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312180 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312205 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312243 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312269 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.312294 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314602 4892 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314701 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314738 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314763 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314792 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314884 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314911 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314939 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.314966 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315057 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315155 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315184 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315220 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315250 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315280 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315306 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315332 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315371 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315414 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315449 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315476 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315502 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315525 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315554 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315582 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315606 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315641 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315673 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315699 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315726 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315752 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315776 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315870 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315898 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.315925 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316005 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316034 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316060 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316085 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316110 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316133 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316157 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316214 4892 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316243 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316270 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316295 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316321 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316345 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316370 4892 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316396 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316422 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316450 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316478 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316503 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316530 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316556 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316582 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316618 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316644 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316668 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316705 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" 
seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316738 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316768 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316802 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316863 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316899 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.316928 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 
17:43:49.317004 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317032 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317056 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317081 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317109 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317146 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317172 4892 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317202 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317229 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317260 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317290 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317311 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317330 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317353 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317374 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317394 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317421 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317442 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317471 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317535 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317563 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317584 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317604 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317624 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317640 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317656 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317670 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317719 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317734 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317749 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317792 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317807 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317843 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317868 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317889 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317911 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317930 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317947 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.317988 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318006 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318024 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318043 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318061 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318078 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318094 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318112 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318130 4892 reconstruct.go:97] "Volume reconstruction finished" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.318142 4892 reconciler.go:26] "Reconciler: start to sync state" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.326093 4892 manager.go:324] Recovery completed Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.344036 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.345792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.345857 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 
17:43:49.345875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.346964 4892 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.346989 4892 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.347018 4892 state_mem.go:36] "Initialized new in-memory state store" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.355355 4892 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.358134 4892 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.358182 4892 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.358215 4892 kubelet.go:2335] "Starting kubelet main sync loop" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.358266 4892 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.358934 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.358996 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" 
Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.364731 4892 policy_none.go:49] "None policy: Start" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.368745 4892 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.368785 4892 state_mem.go:35] "Initializing new in-memory state store" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.387437 4892 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.439493 4892 manager.go:334] "Starting Device Plugin manager" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.439563 4892 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.439578 4892 server.go:79] "Starting device plugin registration server" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.440112 4892 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.440150 4892 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.440398 4892 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.440491 4892 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.440504 4892 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.454097 4892 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.458398 4892 kubelet.go:2421] 
"SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.458488 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.459572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.459618 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.459631 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.459786 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.460124 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.460193 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.460614 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.460656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.460669 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.460838 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461025 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461105 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461413 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461760 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.461934 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462084 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462135 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462574 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462916 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.462930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.463051 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.463089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.463111 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.463125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: 
I0217 17:43:49.463188 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.463231 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464321 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464359 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464586 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.464625 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.465602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.465640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.465654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.488072 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="400ms" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.519802 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.519894 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.519935 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.519969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520004 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520034 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520136 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520216 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520270 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520311 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520360 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520406 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520441 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.520485 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.540276 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.541755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.541899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.541923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.541962 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.542640 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.41:6443: connect: 
connection refused" node="crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.621806 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622161 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622253 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622180 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622346 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622378 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622407 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622408 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622467 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622469 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622497 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622523 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622570 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: 
I0217 17:43:49.622591 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622616 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622624 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622656 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622665 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622688 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622742 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622756 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622857 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622899 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622709 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622952 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.622994 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.743368 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.745138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.745195 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.745213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.745251 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.745750 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.41:6443: 
connect: connection refused" node="crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.804687 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.822567 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.833521 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.849907 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: I0217 17:43:49.857418 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.860117 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c6d7751d8f86bfd6cd7a681059816b0bab40fbf5c4f2bd906cd86ed3667498bb WatchSource:0}: Error finding container c6d7751d8f86bfd6cd7a681059816b0bab40fbf5c4f2bd906cd86ed3667498bb: Status 404 returned error can't find the container with id c6d7751d8f86bfd6cd7a681059816b0bab40fbf5c4f2bd906cd86ed3667498bb Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.863263 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4b9bfd9374d17d76325eed7bf204503afd784bc3ead0642a083b6001428afa43 WatchSource:0}: Error finding container 4b9bfd9374d17d76325eed7bf204503afd784bc3ead0642a083b6001428afa43: Status 404 returned error can't find the 
container with id 4b9bfd9374d17d76325eed7bf204503afd784bc3ead0642a083b6001428afa43 Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.863895 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4a9e0575314c83f262bf8f64c3b009f67ac1dc0292c89a71586c04b786f728ca WatchSource:0}: Error finding container 4a9e0575314c83f262bf8f64c3b009f67ac1dc0292c89a71586c04b786f728ca: Status 404 returned error can't find the container with id 4a9e0575314c83f262bf8f64c3b009f67ac1dc0292c89a71586c04b786f728ca Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.877993 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-75d587578048988b40181d3f8de39df8897870d8cb544fe178090dd8bbf0f596 WatchSource:0}: Error finding container 75d587578048988b40181d3f8de39df8897870d8cb544fe178090dd8bbf0f596: Status 404 returned error can't find the container with id 75d587578048988b40181d3f8de39df8897870d8cb544fe178090dd8bbf0f596 Feb 17 17:43:49 crc kubenswrapper[4892]: W0217 17:43:49.881313 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-01b793468c01f5cdafe77a9eaa5b973c49af5fef3dd653925519737996ae7792 WatchSource:0}: Error finding container 01b793468c01f5cdafe77a9eaa5b973c49af5fef3dd653925519737996ae7792: Status 404 returned error can't find the container with id 01b793468c01f5cdafe77a9eaa5b973c49af5fef3dd653925519737996ae7792 Feb 17 17:43:49 crc kubenswrapper[4892]: E0217 17:43:49.888731 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection 
refused" interval="800ms" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.146550 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.148016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.148073 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.148089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.148127 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.148864 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.41:6443: connect: connection refused" node="crc" Feb 17 17:43:50 crc kubenswrapper[4892]: W0217 17:43:50.211937 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.212046 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.272887 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:50 crc kubenswrapper[4892]: W0217 17:43:50.284509 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.284604 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.285432 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 16:44:47.280215861 +0000 UTC Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.363258 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"75d587578048988b40181d3f8de39df8897870d8cb544fe178090dd8bbf0f596"} Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.364862 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a9e0575314c83f262bf8f64c3b009f67ac1dc0292c89a71586c04b786f728ca"} Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.366005 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b9bfd9374d17d76325eed7bf204503afd784bc3ead0642a083b6001428afa43"} Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.366809 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c6d7751d8f86bfd6cd7a681059816b0bab40fbf5c4f2bd906cd86ed3667498bb"} Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.367688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"01b793468c01f5cdafe77a9eaa5b973c49af5fef3dd653925519737996ae7792"} Feb 17 17:43:50 crc kubenswrapper[4892]: W0217 17:43:50.571524 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.571611 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.690344 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="1.6s" Feb 17 17:43:50 crc kubenswrapper[4892]: W0217 17:43:50.926421 4892 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.926869 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.949871 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.952753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.952796 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.952838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:50 crc kubenswrapper[4892]: I0217 17:43:50.952872 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:43:50 crc kubenswrapper[4892]: E0217 17:43:50.953429 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.41:6443: connect: connection refused" node="crc" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.272468 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.41:6443: 
connect: connection refused Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.285617 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:03:03.448018897 +0000 UTC Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.311001 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 17:43:51 crc kubenswrapper[4892]: E0217 17:43:51.312664 4892 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.372553 4892 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="768f7d12789bbed0b254d9ad4fbb0364b45e71930bf5622bb9d9b02a7c91ae50" exitCode=0 Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.372688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"768f7d12789bbed0b254d9ad4fbb0364b45e71930bf5622bb9d9b02a7c91ae50"} Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.372841 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.374610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.374659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 
17:43:51.374673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.374972 4892 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3" exitCode=0 Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.375012 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3"} Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.375078 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.376483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.376524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.376536 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.378000 4892 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22" exitCode=0 Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.378040 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22"} Feb 17 17:43:51 crc 
kubenswrapper[4892]: I0217 17:43:51.378122 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.379375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.379400 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.379411 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.383729 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd"} Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.383763 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41"} Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.383779 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2"} Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.383793 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff"} 
Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.383799 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.384799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.384844 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.384853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.385837 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5" exitCode=0 Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.385876 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5"} Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.386297 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.388099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.388139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.388150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.393038 4892 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.393912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.393962 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:51 crc kubenswrapper[4892]: I0217 17:43:51.394174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:52 crc kubenswrapper[4892]: W0217 17:43:52.064951 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:52 crc kubenswrapper[4892]: E0217 17:43:52.065036 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.272332 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.285755 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 10:36:25.522429576 +0000 UTC Feb 17 17:43:52 crc kubenswrapper[4892]: E0217 17:43:52.292075 4892 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="3.2s" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.390870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.390913 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.390922 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.391013 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.391665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.391689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.391696 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.395659 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.395682 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.395691 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.395701 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.399538 4892 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="266389586fe3d597964081c84348f75ccac547d179c85375d837f25a6ac57c7c" exitCode=0 Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.399617 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"266389586fe3d597964081c84348f75ccac547d179c85375d837f25a6ac57c7c"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.399688 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.401261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.401307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.401322 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.401791 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.401720 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb"} Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.401802 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.403070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.403111 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.403127 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.403773 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.403803 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.403836 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:52 crc kubenswrapper[4892]: W0217 17:43:52.410907 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:52 crc kubenswrapper[4892]: E0217 17:43:52.411007 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:52 crc kubenswrapper[4892]: W0217 17:43:52.494436 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:52 crc kubenswrapper[4892]: E0217 17:43:52.494536 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.554580 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.556291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.556335 4892 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.556347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.556376 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:43:52 crc kubenswrapper[4892]: E0217 17:43:52.559202 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.41:6443: connect: connection refused" node="crc" Feb 17 17:43:52 crc kubenswrapper[4892]: I0217 17:43:52.890863 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:43:53 crc kubenswrapper[4892]: W0217 17:43:53.066701 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:43:53 crc kubenswrapper[4892]: E0217 17:43:53.066956 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.41:6443: connect: connection refused" logger="UnhandledError" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.287081 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:20:46.695608061 +0000 UTC Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.406966 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.409147 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24" exitCode=255 Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.409220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24"} Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.409474 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.411283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.411347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.411371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.412226 4892 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="08598fcc87010d29b5fa6e25f077f7b55537751361231fbde800aa278e2954f9" exitCode=0 Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.412317 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.412340 4892 scope.go:117] "RemoveContainer" containerID="8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24" Feb 17 17:43:53 crc 
kubenswrapper[4892]: I0217 17:43:53.412369 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.412756 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"08598fcc87010d29b5fa6e25f077f7b55537751361231fbde800aa278e2954f9"} Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.412961 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.413372 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.413426 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.413449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.414600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.414627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.414635 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.414790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.414850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:53 
crc kubenswrapper[4892]: I0217 17:43:53.414867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:53 crc kubenswrapper[4892]: I0217 17:43:53.953155 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.027983 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.289007 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:34:07.959568926 +0000 UTC Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.431469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c1f0721415a6bfa0764f985f579345d35bac0a56978bff1bba4ae16a71ec2a0"} Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.431511 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"34efb5409d2b86ecf4193ea82dc70ebf59c72f9b0f7460fb868e632f16b567d8"} Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.431523 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"226b8c49d008f0a45b45ec1a8ffdfccc571c4eda281a8a17d5fb74c007dec6ae"} Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.431532 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"691f098ec24b434c424081ff3d0323ffcfd387ade7ccece1fabee064b81b394e"} Feb 17 17:43:54 crc 
kubenswrapper[4892]: I0217 17:43:54.433786 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.435725 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6"} Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.435777 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.435845 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.436966 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.437005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.437023 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.437077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.437122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.437158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:54 crc kubenswrapper[4892]: I0217 17:43:54.475934 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.289227 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 15:19:33.690348321 +0000 UTC Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.443442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"88ad26081caa5f1a5b3b789c7c08aea929b9bfbb73ff00ea14a3fef15873bcb7"} Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.443475 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.443563 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.443581 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.445047 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.445083 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.445093 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.445674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.445710 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:55 crc 
kubenswrapper[4892]: I0217 17:43:55.445723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.642300 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.760123 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.762096 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.762139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.762153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.762178 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:43:55 crc kubenswrapper[4892]: I0217 17:43:55.928467 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.289368 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 15:18:41.577695122 +0000 UTC Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.408271 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.447431 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.447431 4892 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.448885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.448948 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.448972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.448994 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.449033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:56 crc kubenswrapper[4892]: I0217 17:43:56.449056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.290306 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 00:22:01.913465234 +0000 UTC Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.450040 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.450142 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.451032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.451081 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.451098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.451845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.451873 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.451882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.727722 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.728060 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.729523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.729569 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:57 crc kubenswrapper[4892]: I0217 17:43:57.729579 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:58 crc kubenswrapper[4892]: I0217 17:43:58.291347 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:58:47.680648359 +0000 UTC Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.199006 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.199222 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.201297 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.201368 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.201387 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.292313 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 04:39:02.635886505 +0000 UTC Feb 17 17:43:59 crc kubenswrapper[4892]: E0217 17:43:59.454673 4892 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.809949 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.810141 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.811479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.811521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:43:59 crc kubenswrapper[4892]: I0217 17:43:59.811537 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:00 crc kubenswrapper[4892]: I0217 17:44:00.292703 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:21:06.734729181 +0000 UTC Feb 17 17:44:00 crc kubenswrapper[4892]: I0217 17:44:00.728611 4892 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 17:44:00 crc kubenswrapper[4892]: I0217 17:44:00.728722 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.094260 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.094502 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.096051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.096101 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.096118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.104673 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.292873 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 00:44:50.264374641 +0000 UTC Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.463897 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.465864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.465918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.465935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:01 crc kubenswrapper[4892]: I0217 17:44:01.473283 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:44:02 crc kubenswrapper[4892]: I0217 17:44:02.293473 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 20:21:15.199732078 +0000 UTC Feb 17 17:44:02 crc kubenswrapper[4892]: I0217 17:44:02.466189 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:44:02 crc kubenswrapper[4892]: I0217 17:44:02.467771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:02 crc kubenswrapper[4892]: I0217 
17:44:02.467840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:02 crc kubenswrapper[4892]: I0217 17:44:02.467858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.273676 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.294151 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 10:56:00.900551621 +0000 UTC Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.408174 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.408246 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.416339 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get 
path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.416384 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.961947 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]log ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]etcd ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/priority-and-fairness-filter ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-apiextensions-informers ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-apiextensions-controllers ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/crd-informer-synced ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-system-namespaces-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 17 17:44:03 crc kubenswrapper[4892]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/bootstrap-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-registration-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-discovery-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]autoregister-completion ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-openapi-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 17 17:44:03 crc kubenswrapper[4892]: livez check failed
Feb 17 17:44:03 crc kubenswrapper[4892]: I0217 17:44:03.962038 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.029193 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.029293 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.263658 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.263949 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.266209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.266287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.266309 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.294562 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 23:12:47.943055606 +0000 UTC
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.333874 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.471226 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.472295 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.472349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.472365 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:04 crc kubenswrapper[4892]: I0217 17:44:04.485959 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 17 17:44:05 crc kubenswrapper[4892]: I0217 17:44:05.295646 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 03:36:57.27699007 +0000 UTC
Feb 17 17:44:05 crc kubenswrapper[4892]: I0217 17:44:05.473504 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 17:44:05 crc kubenswrapper[4892]: I0217 17:44:05.474432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:05 crc kubenswrapper[4892]: I0217 17:44:05.474480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:05 crc kubenswrapper[4892]: I0217 17:44:05.474498 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:06 crc kubenswrapper[4892]: I0217 17:44:06.295837 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 19:57:25.265832035 +0000 UTC
Feb 17 17:44:06 crc kubenswrapper[4892]: I0217 17:44:06.409648 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 17 17:44:06 crc kubenswrapper[4892]: I0217 17:44:06.410015 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 17 17:44:07 crc kubenswrapper[4892]: I0217 17:44:07.297388 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:54:26.760764934 +0000 UTC
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.298117 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:07:11.367876221 +0000 UTC
Feb 17 17:44:08 crc kubenswrapper[4892]: E0217 17:44:08.413359 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.416488 4892 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.418763 4892 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.419746 4892 trace.go:236] Trace[263452469]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 17:43:55.440) (total time: 12979ms):
Feb 17 17:44:08 crc kubenswrapper[4892]: Trace[263452469]: ---"Objects listed" error: 12979ms (17:44:08.419)
Feb 17 17:44:08 crc kubenswrapper[4892]: Trace[263452469]: [12.979353291s] [12.979353291s] END
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.420186 4892 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 17:44:08 crc kubenswrapper[4892]: E0217 17:44:08.421555 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.421975 4892 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.423652 4892 trace.go:236] Trace[2074839748]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 17:43:57.692) (total time: 10730ms):
Feb 17 17:44:08 crc kubenswrapper[4892]: Trace[2074839748]: ---"Objects listed" error: 10730ms (17:44:08.423)
Feb 17 17:44:08 crc kubenswrapper[4892]: Trace[2074839748]: [10.730623019s] [10.730623019s] END
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.423719 4892 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.431763 4892 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.497713 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.502485 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.957205 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.958283 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.958371 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 17 17:44:08 crc kubenswrapper[4892]: I0217 17:44:08.964226 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.268568 4892 apiserver.go:52] "Watching apiserver"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.271780 4892 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.272087 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.272432 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.272491 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.272639 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.272671 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.272767 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.272979 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.273028 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.273053 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.273468 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.275702 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.276060 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.276378 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.276604 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.276806 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.276975 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.277845 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.277945 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.278019 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.287728 4892 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.298939 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 07:39:05.165573897 +0000 UTC
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.311230 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327456 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327492 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327517 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327539 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327561 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327584 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327607 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327660 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327680 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327702 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327719 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327737 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327756 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327775 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327855 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327899 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327918 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327945 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.327987 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328006 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328022 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328040 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328083 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328101 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328120 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328160 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328180 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328199 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328240 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328257 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328274 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328290 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328306 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328323 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328341 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328358 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328374 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328391 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328407 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328440 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328461 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328487 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328521 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328539 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328556 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.328890 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329004 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329121 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329109 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329190 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329196 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329215 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329343 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329395 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329523 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329550 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329673 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329746 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.329613 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330052 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330357 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330398 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 
17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330418 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330837 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332855 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332957 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333286 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333446 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330024 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330408 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330625 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330646 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330689 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.330749 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331039 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331105 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331129 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331250 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331269 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333506 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333724 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334049 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331421 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331451 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331464 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331650 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334236 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334456 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331938 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332012 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332040 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332058 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334654 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.331998 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332697 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332964 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.332905 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333119 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333280 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333289 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333430 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333742 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334678 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.333831 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334063 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334081 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334266 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335010 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335258 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335044 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:43:53Z\\\",\\\"message\\\":\\\"W0217 17:43:52.568525 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 17:43:52.569064 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771350232 cert, and key in /tmp/serving-cert-4124004612/serving-signer.crt, /tmp/serving-cert-4124004612/serving-signer.key\\\\nI0217 17:43:52.851033 1 observer_polling.go:159] Starting file observer\\\\nW0217 17:43:52.854033 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 17:43:52.854213 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:43:52.854862 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4124004612/tls.crt::/tmp/serving-cert-4124004612/tls.key\\\\\\\"\\\\nF0217 17:43:53.034283 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334340 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334585 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.334726 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335044 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335531 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335082 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335316 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335703 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335784 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335854 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335889 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.335906 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:44:09.835881107 +0000 UTC m=+21.211284372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336012 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336093 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.335941 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336094 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336150 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336322 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336618 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336673 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336705 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336159 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336469 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336096 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336639 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336855 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.336923 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337019 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337062 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337275 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337408 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337392 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337539 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337684 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.337946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338070 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338109 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338180 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338387 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338437 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338488 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338219 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.338527 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.339137 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.339303 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.339741 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.339930 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340003 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340058 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.339034 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340242 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340733 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340856 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340909 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340935 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340954 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340977 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340999 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341023 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341045 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341070 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341093 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.340989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341116 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341141 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341164 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341206 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341225 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341243 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341263 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341280 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341298 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341317 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341334 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341353 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341375 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 
17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341399 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341432 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341480 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341513 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341538 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341559 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341578 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341616 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341635 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341654 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 
17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341671 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341690 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341713 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341731 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341748 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341767 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341787 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341822 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341860 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341890 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 17:44:09 crc kubenswrapper[4892]: 
I0217 17:44:09.341915 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341934 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341952 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341973 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341991 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342010 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342028 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342046 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342067 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342089 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 17:44:09 crc 
kubenswrapper[4892]: I0217 17:44:09.342142 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342165 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342186 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342208 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342230 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342252 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342270 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342289 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342314 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342335 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342354 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342372 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342394 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342417 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342435 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342489 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342534 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342939 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342968 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342991 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343012 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343035 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343055 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343074 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343096 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343135 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343155 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343175 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343195 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343214 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343233 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343252 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343271 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343289 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343307 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343341 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343359 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343377 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343396 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343425 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343448 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343468 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343486 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343504 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343524 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341104 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.341891 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342024 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342048 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342091 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342318 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342487 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.342704 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343758 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343910 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344039 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344066 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344506 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344523 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344570 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.343546 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344716 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.344981 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345036 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345009 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345081 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345339 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345375 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345418 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345450 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345487 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345524 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345564 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345601 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345635 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345647 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345668 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345755 4892 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345780 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345802 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345897 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345919 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345939 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345960 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345979 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345999 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346018 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.345374 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346036 4892 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346092 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346130 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346171 4892 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346203 4892 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346208 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346234 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346259 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346263 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346304 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346322 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346339 4892 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346397 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346408 4892 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346419 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346429 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346439 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346452 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346463 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346478 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346490 4892 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346500 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346527 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346540 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346553 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346565 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346577 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346589 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346600 4892 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346611 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346622 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346632 4892 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346642 4892 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346643 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346653 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346685 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346848 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346867 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346879 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346894 4892 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346899 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346908 4892 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346922 4892 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346935 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346947 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346958 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346969 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346979 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346990 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347001 4892 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347011 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347022 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347032 4892 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347044 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347054 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\")
on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347066 4892 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347076 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347086 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347096 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347107 4892 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347119 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347129 4892 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc 
kubenswrapper[4892]: I0217 17:44:09.347139 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347150 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347230 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347242 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347252 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347261 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347270 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347281 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347291 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347303 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347315 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347328 4892 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347337 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347348 4892 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347358 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347370 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347381 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347390 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347400 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347411 4892 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347420 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347432 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347442 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347453 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347463 4892 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347473 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347484 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347495 4892 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347505 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 
17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347514 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347530 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347540 4892 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347550 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347560 4892 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347571 4892 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347611 4892 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346891 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.346919 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347292 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347574 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347645 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.347720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.348073 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.348167 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.348460 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.348952 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349262 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349301 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349404 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349700 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349756 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349788 4892 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.353157 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.349798 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.350291 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.350410 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.350786 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351065 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351113 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351211 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351497 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354704 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354795 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354837 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354977 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.355151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.355179 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.355238 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.355287 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:09.85526925 +0000 UTC m=+21.230672515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351727 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351799 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351852 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352003 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352119 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352128 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352131 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352345 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352479 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352944 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.352983 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.353236 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.355436 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:09.855425894 +0000 UTC m=+21.230829309 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.353397 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.353612 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.353700 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.353711 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354124 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354166 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.353538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.354608 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.351564 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.355555 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.356079 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.356117 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.356522 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.357047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.357416 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.357731 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.357903 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.358104 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.358261 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.358621 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.362664 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.364873 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.364981 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.365340 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.365502 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.365527 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.365547 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.365635 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:09.865610299 +0000 UTC m=+21.241013594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.367038 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.367608 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.367785 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.368214 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.368283 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.368581 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.368520 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.368722 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.369015 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.369053 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.369397 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.369391 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.369486 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.369980 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.371541 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.371735 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.371926 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.372293 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.372713 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.374628 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.375288 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.376454 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.377805 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.378346 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.378674 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.379652 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.379694 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.379714 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.379791 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:09.8797648 +0000 UTC m=+21.255168105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.380188 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.381357 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.381435 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.382002 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.383980 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.383981 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.384725 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.386923 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.388232 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.389694 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.390455 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.392575 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.393447 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.393889 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.396388 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.398496 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.400069 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.401468 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.402106 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.403081 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.403161 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.404166 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.404883 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.405541 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.406709 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.407488 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.408684 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.409436 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.410836 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.411343 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.411457 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.412353 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.412453 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.412938 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.413444 4892 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.413545 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.415736 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" 
path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.416350 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.417285 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.418778 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.419463 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.420521 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.421214 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.422292 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.422552 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.422866 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.423916 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.424590 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.425583 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.426089 4892 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.427074 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.427585 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.428730 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.429256 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.430168 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.430157 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.430678 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.431229 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.432287 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.432796 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.443188 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448069 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448345 4892 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448368 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448387 4892 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448405 4892 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448423 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448440 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448457 4892 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448475 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448491 4892 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448508 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448525 4892 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448542 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448559 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448576 4892 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448593 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448611 4892 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448629 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448647 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448664 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448683 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448700 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448718 4892 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448735 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448751 4892 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448759 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448768 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448800 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448831 4892 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448843 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448877 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448906 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448924 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448941 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448962 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448979 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.448995 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449013 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449029 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449045 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449062 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449082 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449100 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449118 4892 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449135 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449152 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449169 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449185 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449200 4892 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449217 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449234 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449250 4892 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449266 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449282 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449297 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449313 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449330 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node 
\"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449348 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449364 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449379 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449395 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449413 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449433 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449449 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc 
kubenswrapper[4892]: I0217 17:44:09.449475 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449492 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449509 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449525 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449541 4892 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449558 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449575 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449592 4892 reconciler_common.go:293] "Volume detached for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449609 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449629 4892 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449644 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449663 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449680 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449700 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449718 4892 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449777 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449793 4892 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449836 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449855 4892 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449870 4892 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449889 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449905 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449922 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449939 4892 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449956 4892 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449973 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.449990 4892 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.450007 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.450022 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on 
node \"crc\" DevicePath \"\"" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.453885 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:43:53Z\\\",\\\"message\\\":\\\"W0217 17:43:52.568525 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 17:43:52.569064 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771350232 cert, and key in /tmp/serving-cert-4124004612/serving-signer.crt, /tmp/serving-cert-4124004612/serving-signer.key\\\\nI0217 17:43:52.851033 1 observer_polling.go:159] Starting file observer\\\\nW0217 17:43:52.854033 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 17:43:52.854213 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:43:52.854862 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4124004612/tls.crt::/tmp/serving-cert-4124004612/tls.key\\\\\\\"\\\\nF0217 17:43:53.034283 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.464429 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.474691 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.482756 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.483396 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.484582 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6" exitCode=255 Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.485099 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6"} Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.485179 4892 scope.go:117] "RemoveContainer" 
containerID="8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.486780 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.491017 4892 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.491455 4892 scope.go:117] "RemoveContainer" containerID="26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.491927 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.492083 4892 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.496844 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.505206 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.513466 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.523002 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.532561 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.548912 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.567290 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:43:53Z\\\",\\\"message\\\":\\\"W0217 17:43:52.568525 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 17:43:52.569064 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771350232 cert, and key in /tmp/serving-cert-4124004612/serving-signer.crt, /tmp/serving-cert-4124004612/serving-signer.key\\\\nI0217 17:43:52.851033 1 observer_polling.go:159] Starting file observer\\\\nW0217 17:43:52.854033 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 17:43:52.854213 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:43:52.854862 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4124004612/tls.crt::/tmp/serving-cert-4124004612/tls.key\\\\\\\"\\\\nF0217 17:43:53.034283 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.581911 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.591285 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.596764 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.603138 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.608371 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 17:44:09 crc kubenswrapper[4892]: W0217 17:44:09.611645 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-4c913e3b5cd05d35acabfce9af8aeb94a9ddd97c4fc3a7a445d2e73e5485dca8 WatchSource:0}: Error finding container 4c913e3b5cd05d35acabfce9af8aeb94a9ddd97c4fc3a7a445d2e73e5485dca8: Status 404 returned error can't find the container with id 4c913e3b5cd05d35acabfce9af8aeb94a9ddd97c4fc3a7a445d2e73e5485dca8 Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.616186 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 17:44:09 crc kubenswrapper[4892]: W0217 17:44:09.634892 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-2f1cfc73bc201437f74009a9017a3fd2764782cff7ac34e941a8cf727c9bd587 WatchSource:0}: Error finding container 2f1cfc73bc201437f74009a9017a3fd2764782cff7ac34e941a8cf727c9bd587: Status 404 returned error can't find the container with id 2f1cfc73bc201437f74009a9017a3fd2764782cff7ac34e941a8cf727c9bd587 Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.854562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.854613 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:44:10.854595767 +0000 UTC m=+22.229999032 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.955544 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.955595 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.955621 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:09 crc kubenswrapper[4892]: I0217 17:44:09.955645 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.955749 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.955800 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:10.955784254 +0000 UTC m=+22.331187529 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956220 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956251 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956267 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956305 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:10.956293957 +0000 UTC m=+22.331697242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956346 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956373 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:10.956365039 +0000 UTC m=+22.331768314 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956424 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956436 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956446 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:09 crc kubenswrapper[4892]: E0217 17:44:09.956473 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:10.956463582 +0000 UTC m=+22.331866857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.299877 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:40:44.088496537 +0000 UTC Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.359302 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.359475 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.489263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0"} Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.489319 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2f1cfc73bc201437f74009a9017a3fd2764782cff7ac34e941a8cf727c9bd587"} Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.490211 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c7df7c21fb102f72c621140c54fd97c01cd2d599334d872ca113840affb6cae4"} Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.492091 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc"} Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.492187 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f"} Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.492215 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4c913e3b5cd05d35acabfce9af8aeb94a9ddd97c4fc3a7a445d2e73e5485dca8"} Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.494742 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.498281 4892 scope.go:117] "RemoveContainer" containerID="26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6" Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.498420 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.509246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.544533 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.567508 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8500993975f0571405b3d126d6d3b41af094adfe0da1920a6148fd9108688c24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:43:53Z\\\",\\\"message\\\":\\\"W0217 17:43:52.568525 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 17:43:52.569064 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771350232 cert, and key in /tmp/serving-cert-4124004612/serving-signer.crt, /tmp/serving-cert-4124004612/serving-signer.key\\\\nI0217 17:43:52.851033 1 observer_polling.go:159] Starting file observer\\\\nW0217 17:43:52.854033 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 17:43:52.854213 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:43:52.854862 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4124004612/tls.crt::/tmp/serving-cert-4124004612/tls.key\\\\\\\"\\\\nF0217 17:43:53.034283 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.585973 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.604994 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.621398 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.634946 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.649585 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.667059 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.681132 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.695085 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.708035 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.729318 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e3
7f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.742555 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.759492 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.776678 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:10Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.864151 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.864365 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:44:12.864334379 +0000 UTC m=+24.239737664 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.964835 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.964882 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.964905 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:10 crc kubenswrapper[4892]: I0217 17:44:10.964924 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965005 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965014 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965032 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965044 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965066 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965095 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965106 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965076 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:12.965057503 +0000 UTC m=+24.340460768 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965168 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:12.965157586 +0000 UTC m=+24.340560851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965069 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965182 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:12.965175606 +0000 UTC m=+24.340578871 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:10 crc kubenswrapper[4892]: E0217 17:44:10.965278 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:12.965249998 +0000 UTC m=+24.340653303 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.300796 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 17:47:32.711777438 +0000 UTC Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.359247 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.359266 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:11 crc kubenswrapper[4892]: E0217 17:44:11.359436 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:11 crc kubenswrapper[4892]: E0217 17:44:11.359552 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.363492 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.364468 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.942978 4892 csr.go:261] certificate signing request csr-85l6k is approved, waiting to be issued Feb 17 17:44:11 crc kubenswrapper[4892]: I0217 17:44:11.969922 4892 csr.go:257] certificate signing request csr-85l6k is issued Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.301365 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:23:05.112002371 +0000 UTC Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.358795 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.358926 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.504346 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96"} Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.536255 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e3
7f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.553765 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.573473 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.598681 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.615449 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.631838 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.647615 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.666554 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.713153 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6mhzt"] Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.713516 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.714414 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lxpxh"] Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.714625 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.714685 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p6jtp"] Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.715064 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:12 crc kubenswrapper[4892]: W0217 17:44:12.715863 4892 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.715902 4892 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 17:44:12 crc kubenswrapper[4892]: W0217 17:44:12.716061 4892 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Feb 17 17:44:12 crc kubenswrapper[4892]: W0217 17:44:12.716067 4892 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is 
forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.716081 4892 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.716110 4892 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.716729 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.716738 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.716928 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 17:44:12 crc kubenswrapper[4892]: W0217 17:44:12.717058 4892 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list 
resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.717087 4892 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.722668 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.722740 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.722838 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.723085 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.723320 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.723372 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.740354 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.753631 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.763234 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.776597 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.790163 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e3
7f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.802665 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.816432 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.826883 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.836458 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.845560 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.855800 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.865762 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.878935 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879027 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/646d0148-c138-45a2-8f68-51aee16aeff0-hosts-file\") pod \"node-resolver-p6jtp\" (UID: \"646d0148-c138-45a2-8f68-51aee16aeff0\") " pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879048 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-conf-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-etc-kubernetes\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879083 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/f9013d62-9809-436b-82a8-5b18dbf13e35-rootfs\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879098 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43b12f44-0079-4031-9b1d-492c374250df-cni-binary-copy\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879127 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-cni-multus\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879142 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/43b12f44-0079-4031-9b1d-492c374250df-multus-daemon-config\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879160 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-system-cni-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879179 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-netns\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879203 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9013d62-9809-436b-82a8-5b18dbf13e35-proxy-tls\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879219 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-socket-dir-parent\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879237 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-os-release\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879251 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-cni-bin\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879274 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f9013d62-9809-436b-82a8-5b18dbf13e35-mcd-auth-proxy-config\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879290 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxbxx\" (UniqueName: \"kubernetes.io/projected/646d0148-c138-45a2-8f68-51aee16aeff0-kube-api-access-xxbxx\") pod \"node-resolver-p6jtp\" (UID: \"646d0148-c138-45a2-8f68-51aee16aeff0\") " pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879307 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-cni-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879321 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-hostroot\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879337 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-multus-certs\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgp4t\" 
(UniqueName: \"kubernetes.io/projected/43b12f44-0079-4031-9b1d-492c374250df-kube-api-access-jgp4t\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879368 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-cnibin\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-kubelet\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879407 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-k8s-cni-cncf-io\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.879423 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5np8\" (UniqueName: \"kubernetes.io/projected/f9013d62-9809-436b-82a8-5b18dbf13e35-kube-api-access-l5np8\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.879522 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:44:16.879505107 +0000 UTC m=+28.254908362 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.883946 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154ed
c32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 
2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.895578 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.913172 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.926750 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.938293 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.952116 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.965043 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.970782 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 17:39:11 +0000 UTC, rotation deadline is 2026-11-12 02:15:09.875435902 +0000 UTC Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.970858 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6416h30m56.904582123s for next certificate rotation Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.978115 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:12Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980483 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-cni-multus\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980523 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/43b12f44-0079-4031-9b1d-492c374250df-multus-daemon-config\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980550 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-system-cni-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980571 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-netns\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980593 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980618 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9013d62-9809-436b-82a8-5b18dbf13e35-proxy-tls\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-socket-dir-parent\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980662 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9013d62-9809-436b-82a8-5b18dbf13e35-mcd-auth-proxy-config\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980689 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-cni-multus\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxbxx\" (UniqueName: \"kubernetes.io/projected/646d0148-c138-45a2-8f68-51aee16aeff0-kube-api-access-xxbxx\") pod \"node-resolver-p6jtp\" (UID: \"646d0148-c138-45a2-8f68-51aee16aeff0\") " pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980733 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-system-cni-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980748 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-cni-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980790 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-cni-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980837 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-os-release\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980862 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-cni-bin\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980883 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-hostroot\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980901 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-multus-certs\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980922 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgp4t\" (UniqueName: \"kubernetes.io/projected/43b12f44-0079-4031-9b1d-492c374250df-kube-api-access-jgp4t\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " 
pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980954 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980978 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-cnibin\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.980996 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-kubelet\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981027 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5np8\" (UniqueName: \"kubernetes.io/projected/f9013d62-9809-436b-82a8-5b18dbf13e35-kube-api-access-l5np8\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981042 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-socket-dir-parent\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " 
pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981049 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-k8s-cni-cncf-io\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981082 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981094 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-hostroot\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/646d0148-c138-45a2-8f68-51aee16aeff0-hosts-file\") pod \"node-resolver-p6jtp\" (UID: \"646d0148-c138-45a2-8f68-51aee16aeff0\") " pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981150 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-multus-certs\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981110 
4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981162 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981188 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981199 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981213 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981048 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-netns\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981220 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-os-release\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc 
kubenswrapper[4892]: E0217 17:44:12.981218 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:16.981199458 +0000 UTC m=+28.356602833 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981238 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-run-k8s-cni-cncf-io\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981103 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/646d0148-c138-45a2-8f68-51aee16aeff0-hosts-file\") pod \"node-resolver-p6jtp\" (UID: \"646d0148-c138-45a2-8f68-51aee16aeff0\") " pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981265 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:16.981249379 +0000 UTC m=+28.356652744 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981282 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:16.98127464 +0000 UTC m=+28.356678035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-cnibin\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981302 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981301 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-kubelet\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " 
pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981327 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-conf-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981299 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-multus-conf-dir\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981349 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/43b12f44-0079-4031-9b1d-492c374250df-multus-daemon-config\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981369 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-etc-kubernetes\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981317 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981401 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f9013d62-9809-436b-82a8-5b18dbf13e35-rootfs\") pod 
\"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981405 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981348 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-host-var-lib-cni-bin\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43b12f44-0079-4031-9b1d-492c374250df-etc-kubernetes\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981423 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43b12f44-0079-4031-9b1d-492c374250df-cni-binary-copy\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: E0217 17:44:12.981444 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 17:44:16.981434924 +0000 UTC m=+28.356838299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.981440 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f9013d62-9809-436b-82a8-5b18dbf13e35-rootfs\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.982026 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43b12f44-0079-4031-9b1d-492c374250df-cni-binary-copy\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:12 crc kubenswrapper[4892]: I0217 17:44:12.986291 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9013d62-9809-436b-82a8-5b18dbf13e35-proxy-tls\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.001435 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxbxx\" (UniqueName: \"kubernetes.io/projected/646d0148-c138-45a2-8f68-51aee16aeff0-kube-api-access-xxbxx\") pod \"node-resolver-p6jtp\" (UID: \"646d0148-c138-45a2-8f68-51aee16aeff0\") " 
pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.006394 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgp4t\" (UniqueName: \"kubernetes.io/projected/43b12f44-0079-4031-9b1d-492c374250df-kube-api-access-jgp4t\") pod \"multus-lxpxh\" (UID: \"43b12f44-0079-4031-9b1d-492c374250df\") " pod="openshift-multus/multus-lxpxh" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.037213 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lxpxh" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.045679 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p6jtp" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.077429 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4dsxq"] Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.078257 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.081394 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.081623 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.081738 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kp5h9"] Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.082846 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.091764 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.092070 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.092190 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.092374 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.092991 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.093059 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.093952 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.096746 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.108764 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.127691 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.142070 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.164476 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9llf2\" (UniqueName: \"kubernetes.io/projected/202411f4-1b44-41a1-9b8a-23038a0e9bd2-kube-api-access-9llf2\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182200 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-slash\") pod \"ovnkube-node-kp5h9\" (UID: 
\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182222 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182237 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqf4\" (UniqueName: \"kubernetes.io/projected/b23058a0-04ec-4a23-82cb-60f9b368eaa0-kube-api-access-vcqf4\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182277 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cnibin\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182304 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-os-release\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182372 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-ovn\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182412 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-node-log\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182441 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182475 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-system-cni-dir\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182495 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-var-lib-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182532 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182547 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-log-socket\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-ovn-kubernetes\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182598 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-config\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182613 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-script-lib\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc 
kubenswrapper[4892]: I0217 17:44:13.182638 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182656 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-env-overrides\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182678 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-systemd\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-netns\") pod \"ovnkube-node-kp5h9\" (UID: 
\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182782 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-systemd-units\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182804 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-netd\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182834 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovn-node-metrics-cert\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182851 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-bin\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182843 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182867 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-kubelet\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.182974 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-etc-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.198355 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.211552 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.224366 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.235795 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.247703 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.257903 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.269859 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.279713 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-bin\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284293 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-netd\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284308 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovn-node-metrics-cert\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284326 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-kubelet\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284343 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-etc-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284379 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9llf2\" (UniqueName: \"kubernetes.io/projected/202411f4-1b44-41a1-9b8a-23038a0e9bd2-kube-api-access-9llf2\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284394 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-slash\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284398 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-netd\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 
17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-etc-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284473 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-slash\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284414 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284509 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqf4\" (UniqueName: \"kubernetes.io/projected/b23058a0-04ec-4a23-82cb-60f9b368eaa0-kube-api-access-vcqf4\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284523 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-system-cni-dir\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284538 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cnibin\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284556 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-os-release\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284576 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-ovn\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284595 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-node-log\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284616 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284636 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-var-lib-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284664 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284669 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-kubelet\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-log-socket\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284697 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-ovn-kubernetes\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284716 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284732 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-config\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284747 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-script-lib\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-env-overrides\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284785 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-systemd\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284793 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-os-release\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284448 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284846 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cnibin\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284852 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284875 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-netns\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284880 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-system-cni-dir\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284398 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-bin\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202411f4-1b44-41a1-9b8a-23038a0e9bd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.284988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-netns\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285003 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-log-socket\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-ovn-kubernetes\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285235 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-systemd\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285271 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285292 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-ovn\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285310 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-node-log\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285415 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285469 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-var-lib-openvswitch\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285551 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202411f4-1b44-41a1-9b8a-23038a0e9bd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285681 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-systemd-units\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285744 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-systemd-units\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285917 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-script-lib\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285953 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-env-overrides\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.285964 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-config\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.287292 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovn-node-metrics-cert\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.295890 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.300431 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9llf2\" (UniqueName: \"kubernetes.io/projected/202411f4-1b44-41a1-9b8a-23038a0e9bd2-kube-api-access-9llf2\") pod \"multus-additional-cni-plugins-4dsxq\" (UID: \"202411f4-1b44-41a1-9b8a-23038a0e9bd2\") " pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.301592 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:59:50.998858798 +0000 UTC Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.302639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqf4\" (UniqueName: \"kubernetes.io/projected/b23058a0-04ec-4a23-82cb-60f9b368eaa0-kube-api-access-vcqf4\") pod \"ovnkube-node-kp5h9\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc 
kubenswrapper[4892]: I0217 17:44:13.307086 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.318924 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.332026 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.346755 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.359158 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.359305 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.359385 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:13 crc kubenswrapper[4892]: E0217 17:44:13.359496 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:13 crc kubenswrapper[4892]: E0217 17:44:13.359613 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.372280 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.387128 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.400881 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.407174 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.413181 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: W0217 17:44:13.420304 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202411f4_1b44_41a1_9b8a_23038a0e9bd2.slice/crio-e07260191e53d7c57751fa44ed4c6277783442143fae0919477b624feada767e WatchSource:0}: Error finding container e07260191e53d7c57751fa44ed4c6277783442143fae0919477b624feada767e: Status 404 returned error can't find the container with id e07260191e53d7c57751fa44ed4c6277783442143fae0919477b624feada767e Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 
17:44:13.424430 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.438356 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: W0217 17:44:13.450528 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb23058a0_04ec_4a23_82cb_60f9b368eaa0.slice/crio-17cafc21d14ff978199bd24633c480073cfd1a24a1f65dc24eaeddf962fb96d1 WatchSource:0}: Error finding container 17cafc21d14ff978199bd24633c480073cfd1a24a1f65dc24eaeddf962fb96d1: Status 404 returned error can't find the container with id 17cafc21d14ff978199bd24633c480073cfd1a24a1f65dc24eaeddf962fb96d1 Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.511083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"17cafc21d14ff978199bd24633c480073cfd1a24a1f65dc24eaeddf962fb96d1"} Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.512836 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerStarted","Data":"5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb"} Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.512896 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerStarted","Data":"3c30649ebb1643ef547e4458f921f78eecf5ea5b8fc3a95181db24c1061bf6b7"} Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.514433 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" 
event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerStarted","Data":"e07260191e53d7c57751fa44ed4c6277783442143fae0919477b624feada767e"} Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.521244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p6jtp" event={"ID":"646d0148-c138-45a2-8f68-51aee16aeff0","Type":"ContainerStarted","Data":"a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2"} Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.521293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p6jtp" event={"ID":"646d0148-c138-45a2-8f68-51aee16aeff0","Type":"ContainerStarted","Data":"03af4173349d0798ee3ab9fc1e74d06aba30c2c498e414c7f6183933e1452fb3"} Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.528946 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.531284 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.540958 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f
416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.548424 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.551935 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9013d62-9809-436b-82a8-5b18dbf13e35-mcd-auth-proxy-config\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.555948 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.571452 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.584141 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.600712 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.618559 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.630490 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.650367 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.667467 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.682606 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.695509 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.744720 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.746352 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.759183 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5np8\" (UniqueName: \"kubernetes.io/projected/f9013d62-9809-436b-82a8-5b18dbf13e35-kube-api-access-l5np8\") pod \"machine-config-daemon-6mhzt\" (UID: \"f9013d62-9809-436b-82a8-5b18dbf13e35\") " pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.797380 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.840951 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.881332 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.924339 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.958663 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:13 crc kubenswrapper[4892]: I0217 17:44:13.994757 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:13Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.028408 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.029196 4892 scope.go:117] "RemoveContainer" containerID="26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6" Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.029349 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.034129 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.083907 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.117126 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.152769 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.205447 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.205729 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.211653 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:44:14 crc kubenswrapper[4892]: W0217 17:44:14.233024 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9013d62_9809_436b_82a8_5b18dbf13e35.slice/crio-38bf4a288a3eb8bdd14276e4e8caa278769989a395ca4da5eba264501c3e12a9 WatchSource:0}: Error finding container 38bf4a288a3eb8bdd14276e4e8caa278769989a395ca4da5eba264501c3e12a9: Status 404 returned error can't find the container with id 38bf4a288a3eb8bdd14276e4e8caa278769989a395ca4da5eba264501c3e12a9 Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.254421 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.298083 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.302270 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:05:11.667201936 +0000 UTC Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.358582 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.358778 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.527131 4892 generic.go:334] "Generic (PLEG): container finished" podID="202411f4-1b44-41a1-9b8a-23038a0e9bd2" containerID="f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21" exitCode=0 Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.527218 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerDied","Data":"f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.530656 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.530708 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.530723 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"38bf4a288a3eb8bdd14276e4e8caa278769989a395ca4da5eba264501c3e12a9"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.537779 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" exitCode=0 Feb 17 17:44:14 crc 
kubenswrapper[4892]: I0217 17:44:14.537838 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.546885 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.563246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.578989 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.593995 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.606054 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.623781 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.638313 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.652078 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.667950 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.694985 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.738805 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.783244 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.820555 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.822210 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.824525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.824574 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.824586 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.824691 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.897973 4892 kubelet_node_status.go:115] 
"Node was previously registered" node="crc" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.898185 4892 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.899057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.899099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.899115 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.899135 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.899150 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:14Z","lastTransitionTime":"2026-02-17T17:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.921783 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.922520 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.925934 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.925976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.925985 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.925999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.926008 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:14Z","lastTransitionTime":"2026-02-17T17:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.938975 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.945411 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.948667 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.948718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.948736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.948758 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.948775 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:14Z","lastTransitionTime":"2026-02-17T17:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.962439 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.966217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.966253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.966263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.966279 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.966287 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:14Z","lastTransitionTime":"2026-02-17T17:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.974444 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.976293 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.979043 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.979078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.979089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.979105 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.979117 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:14Z","lastTransitionTime":"2026-02-17T17:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.995992 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:14Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:14 crc kubenswrapper[4892]: E0217 17:44:14.996214 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.997682 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.997713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.997726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.997747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:14 crc kubenswrapper[4892]: I0217 17:44:14.997759 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:14Z","lastTransitionTime":"2026-02-17T17:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.018051 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.057184 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.098877 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.100092 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.100149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.100161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.100180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.100194 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.134996 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.178485 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.205400 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.205447 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.205461 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.205481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.205496 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.216193 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.255949 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.293270 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.302909 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 14:23:33.528542411 +0000 UTC Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.308396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.308439 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.308452 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.308471 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.308484 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.337731 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.358864 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.358898 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:15 crc kubenswrapper[4892]: E0217 17:44:15.359012 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:15 crc kubenswrapper[4892]: E0217 17:44:15.359125 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.378871 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.411275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.411306 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.411314 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.411326 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.411335 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.513764 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.513789 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.513798 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.513826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.513834 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.542923 4892 generic.go:334] "Generic (PLEG): container finished" podID="202411f4-1b44-41a1-9b8a-23038a0e9bd2" containerID="011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91" exitCode=0 Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.542990 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerDied","Data":"011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.554870 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.555404 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.555444 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.555457 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.555469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.555481 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.555493 4892 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.566227 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.578663 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.590406 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.600441 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.616109 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.616138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.616146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.616158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.616167 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.619339 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.656204 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.694570 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e3
7f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.718532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.718730 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.718739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.718753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.718762 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.736069 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.795196 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.820562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.820605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.820617 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.820646 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.820659 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.840115 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.853831 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.893310 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:15Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.923492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.923515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.923524 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.923537 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:15 crc kubenswrapper[4892]: I0217 17:44:15.923546 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:15Z","lastTransitionTime":"2026-02-17T17:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.026452 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.026499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.026520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.026544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.026560 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.027371 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5v456"] Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.027752 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.029558 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.030117 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.030352 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.030554 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.047061 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.061270 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 
17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.096805 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.131661 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-serviceca\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.131702 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-host\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.131729 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfctg\" (UniqueName: \"kubernetes.io/projected/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-kube-api-access-xfctg\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.134219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 
17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.134252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.134263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.134278 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.134290 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.141488 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.176758 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.223951 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.232330 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfctg\" (UniqueName: \"kubernetes.io/projected/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-kube-api-access-xfctg\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.232402 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-serviceca\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " 
pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.232421 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-host\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.232468 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-host\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.233842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-serviceca\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.237268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.237308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.237319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.237337 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.237348 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.258954 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.282999 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfctg\" (UniqueName: \"kubernetes.io/projected/62da95c0-b8b4-410e-a5e8-f3ab44db53b4-kube-api-access-xfctg\") pod \"node-ca-5v456\" (UID: \"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\") " pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.303739 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 11:41:15.918898066 +0000 UTC Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.317110 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.339699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc 
kubenswrapper[4892]: I0217 17:44:16.339949 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.340097 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.340235 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.340356 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.345888 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5v456" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.359472 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.359482 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: E0217 17:44:16.359753 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:16 crc kubenswrapper[4892]: W0217 17:44:16.363386 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62da95c0_b8b4_410e_a5e8_f3ab44db53b4.slice/crio-ee7961dbfdd80f2908f8b9dc6f7ef0bba3f97ac5ce6de67c535dc63705475ba3 WatchSource:0}: Error finding container ee7961dbfdd80f2908f8b9dc6f7ef0bba3f97ac5ce6de67c535dc63705475ba3: Status 404 returned error can't find the container with id ee7961dbfdd80f2908f8b9dc6f7ef0bba3f97ac5ce6de67c535dc63705475ba3 Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.402546 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85a
ed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.435916 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.444208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.444275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.444298 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.444327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.444349 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.481736 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.514910 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.547473 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.547520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.547537 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.547562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.547579 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.556265 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.560099 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5v456" event={"ID":"62da95c0-b8b4-410e-a5e8-f3ab44db53b4","Type":"ContainerStarted","Data":"ee7961dbfdd80f2908f8b9dc6f7ef0bba3f97ac5ce6de67c535dc63705475ba3"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.563132 4892 generic.go:334] "Generic (PLEG): container finished" podID="202411f4-1b44-41a1-9b8a-23038a0e9bd2" containerID="fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf" exitCode=0 Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.563183 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerDied","Data":"fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.602749 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81
b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.638729 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.650726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.650763 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.650774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.650790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.650853 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.674504 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.715137 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.757617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.757672 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.757688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.757709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.757726 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.757914 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.797187 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.840258 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.862901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc 
kubenswrapper[4892]: I0217 17:44:16.862941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.862956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.862977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.862991 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.880045 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.920147 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: E0217 17:44:16.939420 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:44:24.93938428 +0000 UTC m=+36.314787585 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.939266 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.954850 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.965837 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.965863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.965876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.965892 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.965906 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:16Z","lastTransitionTime":"2026-02-17T17:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:16 crc kubenswrapper[4892]: I0217 17:44:16.998596 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:16Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.041909 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.042010 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042038 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.042065 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.042109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042146 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 17:44:25.042115358 +0000 UTC m=+36.417518663 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042258 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042283 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042301 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042352 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:25.042335104 +0000 UTC m=+36.417738399 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042426 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042467 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:25.042454897 +0000 UTC m=+36.417858192 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042583 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042604 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042621 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.042659 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:25.042646813 +0000 UTC m=+36.418050118 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.043186 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.071749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.071805 4892 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.071864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.071891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.071908 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.078463 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.125596 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.174696 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.174743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.174752 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.174766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.174776 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.285002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.285045 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.285056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.285072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.285084 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.304601 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 21:11:13.679142386 +0000 UTC Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.359254 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.359414 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.359441 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:17 crc kubenswrapper[4892]: E0217 17:44:17.359604 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.390110 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.390176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.390201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.390230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.390252 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.493178 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.493591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.493613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.493639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.493660 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.568134 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5v456" event={"ID":"62da95c0-b8b4-410e-a5e8-f3ab44db53b4","Type":"ContainerStarted","Data":"e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.574239 4892 generic.go:334] "Generic (PLEG): container finished" podID="202411f4-1b44-41a1-9b8a-23038a0e9bd2" containerID="35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d" exitCode=0 Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.574439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerDied","Data":"35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.584318 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.597028 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.597089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.597107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.597132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.597149 
4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.607110 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.621401 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.648834 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.667521 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.678304 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.702380 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.702419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.702429 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.702444 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.702456 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.704950 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.724227 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.740668 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc 
kubenswrapper[4892]: I0217 17:44:17.752656 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.763324 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.774368 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.784388 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.795279 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.804524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.804552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.804559 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.804573 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.804583 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.807000 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.817497 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.829443 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.841737 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.853040 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.875968 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.906863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.906891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.906900 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.906919 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.906930 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:17Z","lastTransitionTime":"2026-02-17T17:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.919337 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.955286 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:17 crc kubenswrapper[4892]: I0217 17:44:17.995827 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.008996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc 
kubenswrapper[4892]: I0217 17:44:18.009057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.009079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.009108 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.009132 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.045704 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.078223 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.112100 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.112137 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.112152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.112176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.112193 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.140966 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.164591 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.200376 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.218005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.218056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.218073 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.218098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.218113 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.238484 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.305339 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:24:53.36020869 +0000 UTC Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.320466 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.320497 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.320507 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.320523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.320535 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.358967 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:18 crc kubenswrapper[4892]: E0217 17:44:18.359121 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.422397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.422464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.422482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.422928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.422981 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.525845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.525905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.525932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.525957 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.525973 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.590315 4892 generic.go:334] "Generic (PLEG): container finished" podID="202411f4-1b44-41a1-9b8a-23038a0e9bd2" containerID="3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17" exitCode=0 Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.590399 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerDied","Data":"3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.615251 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.629659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.629722 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.629740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 
17:44:18.629769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.629786 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.630620 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.647445 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.663678 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.679799 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.709653 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.723858 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.732799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.732905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.732929 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 
17:44:18.732960 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.732986 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.743010 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.762478 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.787479 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df970757
1519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.801737 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.815011 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.831234 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.835062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.835091 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.835103 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.835120 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.835133 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.846509 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:
44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:18Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.937275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.937306 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.937319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.937335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:18 crc kubenswrapper[4892]: I0217 17:44:18.937346 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:18Z","lastTransitionTime":"2026-02-17T17:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.041759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.041850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.041871 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.041902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.041954 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.123304 4892 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.157391 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.157454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.157490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.157508 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.157520 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.260434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.260489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.260505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.260530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.260548 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.306244 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 11:29:29.867760694 +0000 UTC Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.359149 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.359208 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:19 crc kubenswrapper[4892]: E0217 17:44:19.359413 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:19 crc kubenswrapper[4892]: E0217 17:44:19.359607 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.366377 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.366424 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.366440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.366459 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.366477 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.380005 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.398554 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.417858 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.437110 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.451694 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.469768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.469862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.469873 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.469911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.470233 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.469955 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.489473 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.504313 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.515386 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.549658 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.565365 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.573030 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.573074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.573087 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.573107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.573119 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.582349 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.595698 4892 generic.go:334] "Generic (PLEG): container finished" podID="202411f4-1b44-41a1-9b8a-23038a0e9bd2" containerID="fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1" exitCode=0 Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.595740 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerDied","Data":"fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.598646 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.611855 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.629413 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.644563 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.655253 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.668395 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.675170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.675200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.675209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.675222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.675232 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.680643 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.699565 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.719383 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.733006 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.742246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.753854 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.777794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.777841 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.777850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.777866 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.777875 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.798349 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.841092 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.876601 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.880079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc 
kubenswrapper[4892]: I0217 17:44:19.880112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.880120 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.880132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.880141 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.915470 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df970757
1519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.982734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.982784 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.982795 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.982826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:19 crc kubenswrapper[4892]: I0217 17:44:19.982838 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:19Z","lastTransitionTime":"2026-02-17T17:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.086051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.086122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.086145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.086175 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.086198 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.189976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.190616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.190642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.190727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.190745 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.293684 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.293742 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.293761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.293786 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.293807 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.306600 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 17:57:23.142757241 +0000 UTC Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.359387 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:20 crc kubenswrapper[4892]: E0217 17:44:20.359528 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.397463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.397519 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.397535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.397558 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.397605 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.500141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.500194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.500214 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.500237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.500253 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.604991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.605049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.605289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.605326 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.605348 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.611287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.612103 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.612150 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.632861 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.651471 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.651568 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.653004 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.671524 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.690126 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.707961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.708016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.708033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.708057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.708075 4892 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.710052 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.729175 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.747121 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.766690 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.781853 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.811482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.811537 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.811546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.811561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.811570 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.811527 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.828007 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:
09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.848262 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 
17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.861631 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.875757 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.893230 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.911528 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.914019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.914090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.914117 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.914147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.914172 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:20Z","lastTransitionTime":"2026-02-17T17:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.930637 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.945674 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.967735 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:20 crc kubenswrapper[4892]: I0217 17:44:20.989318 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:0
9Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:20Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.009528 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 
17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.016736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.016794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.016848 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.016883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.016901 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.026921 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.043181 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd
49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.062963 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.082239 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.102139 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.119065 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc 
kubenswrapper[4892]: I0217 17:44:21.119138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.119160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.119188 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.119211 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.125734 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df970757
1519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.142983 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:21Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.221775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.221870 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.221887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.221912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.221929 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.307123 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 00:12:29.13527536 +0000 UTC Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.328350 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.328548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.328568 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.328594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.328612 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.358796 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.358890 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:21 crc kubenswrapper[4892]: E0217 17:44:21.359053 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:21 crc kubenswrapper[4892]: E0217 17:44:21.359463 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.431267 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.431347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.431370 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.431399 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.431422 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.534148 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.534211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.534221 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.534237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.534248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.613692 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.636340 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.636369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.636379 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.636392 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.636401 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.739430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.739493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.739515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.739545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.739566 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.842574 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.842626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.842641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.842661 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.842677 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.945538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.945617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.945648 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.945683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:21 crc kubenswrapper[4892]: I0217 17:44:21.945707 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:21Z","lastTransitionTime":"2026-02-17T17:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.048403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.048458 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.048475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.048496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.048513 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.151574 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.151612 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.151628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.151651 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.151667 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.254960 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.255011 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.255027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.255051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.255068 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.307572 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:56:29.14335215 +0000 UTC Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.357670 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.357739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.357762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.357794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.357854 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.358623 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:22 crc kubenswrapper[4892]: E0217 17:44:22.358787 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.461372 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.461445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.461468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.461501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.461519 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.564766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.564886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.564909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.564943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.564966 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.617586 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.668168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.668215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.668231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.668254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.668271 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.771489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.771526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.771535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.771550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.771561 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.874707 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.874769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.874787 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.874838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.874856 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.977528 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.977605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.977639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.977670 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:22 crc kubenswrapper[4892]: I0217 17:44:22.977690 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:22Z","lastTransitionTime":"2026-02-17T17:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.079916 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.079987 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.080008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.080034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.080052 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.188154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.188201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.188219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.188240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.188252 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.291522 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.291578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.291595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.291619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.291639 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.308022 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 17:59:52.419005807 +0000 UTC Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.358861 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.358940 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:23 crc kubenswrapper[4892]: E0217 17:44:23.358993 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:23 crc kubenswrapper[4892]: E0217 17:44:23.359064 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.394375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.394438 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.394450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.394467 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.394504 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.497231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.497287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.497301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.497327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.497342 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.600511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.600589 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.600611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.600639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.600662 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.703411 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.703481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.703503 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.703534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.703558 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.806107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.806165 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.806183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.806206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.806223 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.908792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.908864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.908881 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.908905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:23 crc kubenswrapper[4892]: I0217 17:44:23.908950 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:23Z","lastTransitionTime":"2026-02-17T17:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.011549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.011613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.011629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.011654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.011671 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.114862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.114920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.114938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.114961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.114977 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.217705 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.217743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.217753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.217767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.217777 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.309089 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:52:45.707559397 +0000 UTC Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.320685 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.320739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.320748 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.320762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.320771 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.358722 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:24 crc kubenswrapper[4892]: E0217 17:44:24.358949 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.424504 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.424550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.424561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.424578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.424589 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.526704 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.526769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.526785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.526808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.526848 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.630198 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.630238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.630247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.630262 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.630273 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.632135 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" event={"ID":"202411f4-1b44-41a1-9b8a-23038a0e9bd2","Type":"ContainerStarted","Data":"dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.645175 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d0
8e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.663734 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.677802 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.680333 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d"] Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.680922 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.683426 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.683967 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.691677 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.710617 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413
bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed5
59100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.721626 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.732954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.732989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.732998 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 
17:44:24.733014 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.733024 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.733958 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.744210 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.758227 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.771252 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.787933 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.804266 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.817052 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.826468 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.831896 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dcf\" (UniqueName: \"kubernetes.io/projected/5497ad5a-f96f-4f2e-ba79-a72f32527546-kube-api-access-g8dcf\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.831983 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5497ad5a-f96f-4f2e-ba79-a72f32527546-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.832083 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5497ad5a-f96f-4f2e-ba79-a72f32527546-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.832144 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5497ad5a-f96f-4f2e-ba79-a72f32527546-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.835141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.835185 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.835199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.835218 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.835231 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.838663 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2
a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.848989 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.859111 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.872205 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.886779 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.929956 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.932917 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dcf\" (UniqueName: \"kubernetes.io/projected/5497ad5a-f96f-4f2e-ba79-a72f32527546-kube-api-access-g8dcf\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.933023 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5497ad5a-f96f-4f2e-ba79-a72f32527546-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.933072 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/5497ad5a-f96f-4f2e-ba79-a72f32527546-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.933105 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5497ad5a-f96f-4f2e-ba79-a72f32527546-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.933662 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5497ad5a-f96f-4f2e-ba79-a72f32527546-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.934611 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5497ad5a-f96f-4f2e-ba79-a72f32527546-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.937877 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.937928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.937941 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.937956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.937967 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:24Z","lastTransitionTime":"2026-02-17T17:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.939144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5497ad5a-f96f-4f2e-ba79-a72f32527546-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.948410 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.958796 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dcf\" (UniqueName: \"kubernetes.io/projected/5497ad5a-f96f-4f2e-ba79-a72f32527546-kube-api-access-g8dcf\") pod \"ovnkube-control-plane-749d76644c-7k28d\" (UID: \"5497ad5a-f96f-4f2e-ba79-a72f32527546\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.960096 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.975234 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2k
z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:24 crc kubenswrapper[4892]: I0217 17:44:24.987898 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:24Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.001474 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.018047 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: W0217 17:44:25.022584 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5497ad5a_f96f_4f2e_ba79_a72f32527546.slice/crio-e6b12ab009e56ff4c99fce8cbb477eadb1b6a7f4b9d921866101ee337844c5c6 WatchSource:0}: Error finding container e6b12ab009e56ff4c99fce8cbb477eadb1b6a7f4b9d921866101ee337844c5c6: Status 404 returned error can't find the container with id e6b12ab009e56ff4c99fce8cbb477eadb1b6a7f4b9d921866101ee337844c5c6 Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.034300 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.034709 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:44:41.034676996 +0000 UTC m=+52.410080301 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.041703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.041771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.041883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.041916 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.041941 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.044327 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.064712 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.078323 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.092163 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.136181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.136261 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.136304 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.136379 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136388 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136478 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:41.13645642 +0000 UTC m=+52.511859755 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136544 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136618 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:41.136596863 +0000 UTC m=+52.512000168 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136678 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136713 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136726 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136826 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:41.136759978 +0000 UTC m=+52.512163323 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136907 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136923 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136935 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.136964 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:41.136955893 +0000 UTC m=+52.512359278 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.145245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.145295 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.145304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.145319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.145328 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.248413 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.248489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.248513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.248543 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.248563 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.275958 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.276022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.276046 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.276061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.279428 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.299556 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.304627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.304678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.304690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.304709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.304721 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.310104 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:08:30.103169009 +0000 UTC Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.320348 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",
\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.324490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.324531 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.324544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.324563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.324577 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.339484 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.344951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.344994 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.345008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.345029 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.345044 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.359371 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.359512 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.359642 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.360046 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.360804 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.365617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.365649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.365659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.365674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.365684 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.380477 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: E0217 17:44:25.380637 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.382366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.382408 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.382420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.382440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.382455 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.485088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.485137 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.485150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.485169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.485182 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.588241 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.588302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.588315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.588347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.588362 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.638179 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" event={"ID":"5497ad5a-f96f-4f2e-ba79-a72f32527546","Type":"ContainerStarted","Data":"74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.638239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" event={"ID":"5497ad5a-f96f-4f2e-ba79-a72f32527546","Type":"ContainerStarted","Data":"a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.638252 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" event={"ID":"5497ad5a-f96f-4f2e-ba79-a72f32527546","Type":"ContainerStarted","Data":"e6b12ab009e56ff4c99fce8cbb477eadb1b6a7f4b9d921866101ee337844c5c6"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.661349 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.680523 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.690778 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.690828 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.690837 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc 
kubenswrapper[4892]: I0217 17:44:25.690854 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.690864 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.698373 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.719976 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.740446 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.756385 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.768182 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.779123 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.793424 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc 
kubenswrapper[4892]: I0217 17:44:25.793756 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.793768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.793785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.793796 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.796239 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.808187 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.832213 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.851874 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.867740 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.878239 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.890024 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:25Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.895479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.895518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.895531 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.895547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.895555 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.997264 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.997305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.997313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.997328 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:25 crc kubenswrapper[4892]: I0217 17:44:25.997337 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:25Z","lastTransitionTime":"2026-02-17T17:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.099637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.099679 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.099689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.099705 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.099715 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.202092 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.202126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.202134 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.202147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.202157 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.304116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.304160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.304176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.304200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.304217 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.311272 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 04:18:36.922199275 +0000 UTC Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.359005 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:26 crc kubenswrapper[4892]: E0217 17:44:26.359227 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.406776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.406829 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.406840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.406858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.406869 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.509585 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.509628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.509637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.509650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.509660 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.523455 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2q4n6"] Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.524285 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:26 crc kubenswrapper[4892]: E0217 17:44:26.524405 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.554380 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.568989 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc 
kubenswrapper[4892]: I0217 17:44:26.585458 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.597824 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.613788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.613840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.613850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.613867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.613878 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.618516 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.635970 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.650306 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.660887 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:26 crc kubenswrapper[4892]: 
I0217 17:44:26.660947 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6h6p\" (UniqueName: \"kubernetes.io/projected/9290105c-74a4-487a-879f-3f79186b3b01-kube-api-access-p6h6p\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.668701 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.684056 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.705655 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.716726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc 
kubenswrapper[4892]: I0217 17:44:26.716790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.716805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.716842 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.716859 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.719399 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed9
6dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.734110 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.749875 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.761680 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.761721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6h6p\" (UniqueName: \"kubernetes.io/projected/9290105c-74a4-487a-879f-3f79186b3b01-kube-api-access-p6h6p\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:26 crc kubenswrapper[4892]: E0217 17:44:26.761927 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:26 crc kubenswrapper[4892]: E0217 
17:44:26.762043 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:27.262010308 +0000 UTC m=+38.637413593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.766693 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.787122 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.788258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6h6p\" (UniqueName: \"kubernetes.io/projected/9290105c-74a4-487a-879f-3f79186b3b01-kube-api-access-p6h6p\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.803113 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:26Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.818894 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.818967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.818982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.819005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.819022 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.922129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.922163 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.922172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.922186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:26 crc kubenswrapper[4892]: I0217 17:44:26.922196 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:26Z","lastTransitionTime":"2026-02-17T17:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.025194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.025249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.025265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.025290 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.025307 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.129042 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.129121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.129146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.129177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.129196 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.232518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.232578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.232630 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.232656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.232674 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.267523 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:27 crc kubenswrapper[4892]: E0217 17:44:27.267749 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:27 crc kubenswrapper[4892]: E0217 17:44:27.267920 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:28.267885232 +0000 UTC m=+39.643288577 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.312123 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:06:52.653278506 +0000 UTC Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.336197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.336266 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.336288 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.336322 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.336344 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.364933 4892 scope.go:117] "RemoveContainer" containerID="26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.365106 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:27 crc kubenswrapper[4892]: E0217 17:44:27.365300 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.366147 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:27 crc kubenswrapper[4892]: E0217 17:44:27.366340 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.439189 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.439238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.439259 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.439281 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.439298 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.542054 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.542122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.542144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.542170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.542193 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.644486 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.644526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.644540 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.644558 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.644593 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.668711 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/0.log" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.671743 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b" exitCode=1 Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.671784 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.672601 4892 scope.go:117] "RemoveContainer" containerID="f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.675412 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.679397 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.679675 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.685677 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.695479 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.709736 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.725995 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.741700 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.762411 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc 
kubenswrapper[4892]: I0217 17:44:27.762453 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.762654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.762675 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.762685 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.769485 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed9
6dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.787246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.802023 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.813190 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.832329 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.847470 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.862316 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.865888 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.865918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.865929 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.865945 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.865955 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.875535 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc 
kubenswrapper[4892]: I0217 17:44:27.886104 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.899485 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.923464 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:27Z\\\",\\\"message\\\":\\\"17 17:44:27.629245 6171 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:27.629285 6171 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:27.629301 6171 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:27.629316 6171 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:27.629317 6171 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 17:44:27.629332 6171 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 17:44:27.629339 6171 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:27.629348 6171 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:27.629355 6171 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:27.629367 6171 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:27.629368 6171 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:27.629392 6171 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:27.629408 6171 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 17:44:27.629431 6171 factory.go:656] Stopping watch factory\\\\nI0217 17:44:27.629445 6171 ovnkube.go:599] Stopped ovnkube\\\\nI0217 17:44:27.629465 6171 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.938332 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.975280 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.976187 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.976217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.976234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.976251 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.976262 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:27Z","lastTransitionTime":"2026-02-17T17:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:27 crc kubenswrapper[4892]: I0217 17:44:27.988890 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.002352 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.015547 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.028921 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.052073 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.072990 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.082280 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc 
kubenswrapper[4892]: I0217 17:44:28.082357 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.082376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.082448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.082480 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.091622 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed9
6dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.109111 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.126595 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.145716 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.161754 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.172660 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.184971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.185420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.185433 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.185453 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.185467 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.195207 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:27Z\\\",\\\"message\\\":\\\"17 17:44:27.629245 6171 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:27.629285 6171 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:27.629301 6171 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:27.629316 6171 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:27.629317 6171 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 17:44:27.629332 6171 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 17:44:27.629339 6171 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:27.629348 6171 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:27.629355 6171 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:27.629367 6171 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:27.629368 6171 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:27.629392 6171 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:27.629408 6171 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 17:44:27.629431 6171 factory.go:656] Stopping watch factory\\\\nI0217 17:44:27.629445 6171 ovnkube.go:599] Stopped ovnkube\\\\nI0217 17:44:27.629465 6171 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.206455 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.278602 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:28 crc kubenswrapper[4892]: E0217 17:44:28.278725 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:28 crc kubenswrapper[4892]: E0217 17:44:28.278772 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:30.278758003 +0000 UTC m=+41.654161268 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.288468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.288496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.288504 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.288518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.288527 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.313320 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 14:38:42.056531986 +0000 UTC Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.358690 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:28 crc kubenswrapper[4892]: E0217 17:44:28.358804 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.359109 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:28 crc kubenswrapper[4892]: E0217 17:44:28.359162 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.390188 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.390213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.390222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.390234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.390242 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.492006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.492031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.492038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.492050 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.492058 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.594923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.594959 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.594968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.594983 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.594997 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.684924 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/1.log" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.685639 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/0.log" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.691636 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85" exitCode=1 Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.691727 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.691804 4892 scope.go:117] "RemoveContainer" containerID="f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.692714 4892 scope.go:117] "RemoveContainer" containerID="54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85" Feb 17 17:44:28 crc kubenswrapper[4892]: E0217 17:44:28.692980 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.699807 4892 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.699896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.699917 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.699954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.699982 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.715139 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.733735 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.750096 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.767030 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.781198 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.802562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.802606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.802619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.802640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.802654 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.814518 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:27Z\\\",\\\"message\\\":\\\"17 17:44:27.629245 6171 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:27.629285 6171 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:27.629301 6171 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:27.629316 6171 handler.go:208] Removed *v1.Node event handler 
7\\\\nI0217 17:44:27.629317 6171 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 17:44:27.629332 6171 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 17:44:27.629339 6171 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:27.629348 6171 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:27.629355 6171 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:27.629367 6171 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:27.629368 6171 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:27.629392 6171 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:27.629408 6171 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 17:44:27.629431 6171 factory.go:656] Stopping watch factory\\\\nI0217 17:44:27.629445 6171 ovnkube.go:599] Stopped ovnkube\\\\nI0217 17:44:27.629465 6171 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 
17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/va
r/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.828288 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc 
kubenswrapper[4892]: I0217 17:44:28.847312 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.863510 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.879635 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.897853 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.905265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.905295 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.905304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.905318 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.905328 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:28Z","lastTransitionTime":"2026-02-17T17:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.919043 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2
a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.935189 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.953233 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:28 crc kubenswrapper[4892]: I0217 17:44:28.970280 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.001057 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:28Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.007804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.007868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.007881 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.007900 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.007913 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.111430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.111488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.111569 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.111593 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.111611 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.214583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.214643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.214658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.214682 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.214701 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.297917 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.314385 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:35:43.083087941 +0000 UTC Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.317430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.317490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.317508 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.317530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.317547 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.358861 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.358871 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:29 crc kubenswrapper[4892]: E0217 17:44:29.359052 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:29 crc kubenswrapper[4892]: E0217 17:44:29.359200 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.382174 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.399799 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.414053 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.419613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.419660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.419677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.419700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.419714 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.430312 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.445696 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.463905 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.481072 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.500761 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.519502 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.521771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.521805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.521831 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.521848 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.521858 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.539434 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.554253 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.565996 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.581698 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.593844 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.614025 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f423d521d3549b7cb74f5f1fa08e42b46969184b639d2bec15a5c0a2d42d596b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:27Z\\\",\\\"message\\\":\\\"17 17:44:27.629245 6171 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:27.629285 6171 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:27.629301 6171 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:27.629316 6171 handler.go:208] Removed *v1.Node event handler 
7\\\\nI0217 17:44:27.629317 6171 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 17:44:27.629332 6171 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 17:44:27.629339 6171 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:27.629348 6171 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:27.629355 6171 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:27.629367 6171 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:27.629368 6171 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:27.629392 6171 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:27.629408 6171 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 17:44:27.629431 6171 factory.go:656] Stopping watch factory\\\\nI0217 17:44:27.629445 6171 ovnkube.go:599] Stopped ovnkube\\\\nI0217 17:44:27.629465 6171 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 
17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/va
r/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.626179 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc 
kubenswrapper[4892]: I0217 17:44:29.628738 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.628794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.628838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.628874 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.628888 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.697591 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/1.log" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.701773 4892 scope.go:117] "RemoveContainer" containerID="54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85" Feb 17 17:44:29 crc kubenswrapper[4892]: E0217 17:44:29.702083 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.718915 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.732464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.732506 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.732518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 
17:44:29.732534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.732545 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.735873 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.750400 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.761534 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.794018 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.811351 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.829932 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.834674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.834739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 
17:44:29.834761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.834791 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.834844 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.853164 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.871366 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.884033 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.896293 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.911078 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.926736 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.937922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.937957 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.937973 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.937996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.938013 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:29Z","lastTransitionTime":"2026-02-17T17:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.943902 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d808063
78f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.964243 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955
d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7f
a93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:29 crc kubenswrapper[4892]: I0217 17:44:29.980267 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.040794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.040849 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.040859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.040873 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.040885 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.143973 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.144031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.144048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.144071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.144088 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.247076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.247175 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.247193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.247219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.247238 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.299549 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:30 crc kubenswrapper[4892]: E0217 17:44:30.299852 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:30 crc kubenswrapper[4892]: E0217 17:44:30.299978 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:34.299946765 +0000 UTC m=+45.675350060 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.314780 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 22:32:45.593105262 +0000 UTC Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.350526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.350583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.350601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.350637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.350658 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.358997 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:30 crc kubenswrapper[4892]: E0217 17:44:30.359180 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.358995 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:30 crc kubenswrapper[4892]: E0217 17:44:30.359332 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.452980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.453017 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.453027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.453042 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.453052 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.555737 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.555797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.555839 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.555866 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.555882 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.659663 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.659734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.659753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.659782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.659801 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.765031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.765095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.765113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.765136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.765153 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.868140 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.868273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.868315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.868357 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.868390 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.972373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.972415 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.972423 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.972440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:30 crc kubenswrapper[4892]: I0217 17:44:30.972453 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:30Z","lastTransitionTime":"2026-02-17T17:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.075467 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.075504 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.075515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.075530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.075540 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.178930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.179018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.179040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.179489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.179768 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.283113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.283178 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.283198 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.283225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.283242 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.315888 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 21:13:11.448667813 +0000 UTC Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.359151 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.359220 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:31 crc kubenswrapper[4892]: E0217 17:44:31.359330 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:31 crc kubenswrapper[4892]: E0217 17:44:31.359552 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.386272 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.386344 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.386366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.386396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.386418 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.489577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.489639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.489660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.489684 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.489701 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.592277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.592351 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.592376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.592405 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.592428 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.695125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.695211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.695233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.695267 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.695287 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.798318 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.798370 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.798388 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.798410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.798429 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.902298 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.902351 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.902367 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.902392 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:31 crc kubenswrapper[4892]: I0217 17:44:31.902415 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:31Z","lastTransitionTime":"2026-02-17T17:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.005022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.005086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.005104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.005129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.005146 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.108401 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.108772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.108996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.109158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.109352 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.212407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.212524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.212543 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.212565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.212581 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.315749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.315838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.315861 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.315886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.315905 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.316875 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 19:26:34.197728248 +0000 UTC Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.358587 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.358584 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:32 crc kubenswrapper[4892]: E0217 17:44:32.359015 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:32 crc kubenswrapper[4892]: E0217 17:44:32.359146 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.418167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.418406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.418474 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.418540 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.418614 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.520636 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.520708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.520725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.520748 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.520765 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.623578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.623674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.623692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.623726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.623765 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.726093 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.726142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.726154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.726171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.726182 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.829275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.829330 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.829347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.829373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.829389 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.932550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.932598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.932646 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.932669 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:32 crc kubenswrapper[4892]: I0217 17:44:32.932680 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:32Z","lastTransitionTime":"2026-02-17T17:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.035920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.035976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.035991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.036015 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.036032 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.139234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.139293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.139315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.139341 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.139362 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.242686 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.242745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.242760 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.242783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.242799 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.318012 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 10:49:45.375942731 +0000 UTC Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.345809 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.345901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.345923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.345951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.345972 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.359079 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:33 crc kubenswrapper[4892]: E0217 17:44:33.359195 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.359091 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:33 crc kubenswrapper[4892]: E0217 17:44:33.359346 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.452348 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.452808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.452930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.453013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.453093 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.555323 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.555689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.555917 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.556119 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.556377 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.659113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.659167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.659186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.659211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.659227 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.761183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.761239 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.761255 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.761278 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.761295 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.863871 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.863927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.863944 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.863968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.863986 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.967505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.967942 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.968182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.968340 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:33 crc kubenswrapper[4892]: I0217 17:44:33.968462 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:33Z","lastTransitionTime":"2026-02-17T17:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.070666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.070724 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.070741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.070762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.070778 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.172802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.172881 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.172899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.172922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.172938 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.275066 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.275199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.275208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.275221 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.275230 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.319107 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:45:07.061806908 +0000 UTC Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.339194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:34 crc kubenswrapper[4892]: E0217 17:44:34.339365 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:34 crc kubenswrapper[4892]: E0217 17:44:34.339424 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:42.339409597 +0000 UTC m=+53.714812862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.359291 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.359291 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:34 crc kubenswrapper[4892]: E0217 17:44:34.359457 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:34 crc kubenswrapper[4892]: E0217 17:44:34.359567 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.377885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.377944 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.377961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.377984 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.378001 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.480649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.481123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.481146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.481176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.481197 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.584328 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.584670 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.584900 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.585106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.585291 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.689110 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.689145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.689153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.689166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.689175 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.791520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.791952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.792177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.792410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.792676 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.895712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.895743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.895753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.895767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.895777 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.998893 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.998978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.999003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.999041 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:34 crc kubenswrapper[4892]: I0217 17:44:34.999075 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:34Z","lastTransitionTime":"2026-02-17T17:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.101368 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.101405 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.101416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.101431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.101442 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.205415 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.205451 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.205460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.205476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.205487 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.307419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.307470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.307480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.307492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.307501 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.319796 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 18:21:28.107357466 +0000 UTC Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.358498 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.358606 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.358758 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.358859 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.410239 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.410270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.410280 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.410300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.410315 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.513889 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.513947 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.513963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.513987 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.514006 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.616320 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.616384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.616400 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.616422 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.616438 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.719656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.719694 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.719708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.719725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.719735 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.782057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.782106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.782119 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.782136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.782147 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.798505 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.801981 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.802012 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.802027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.802048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.802063 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.818897 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.823339 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.823402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.823414 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.823435 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.823447 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.842297 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.846340 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.846387 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.846398 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.846420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.846446 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.863073 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.866651 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.866682 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.866692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.866706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.866715 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.884977 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:35Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:35 crc kubenswrapper[4892]: E0217 17:44:35.885114 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.886995 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.887044 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.887061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.887081 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.887096 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.989217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.989275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.989287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.989304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:35 crc kubenswrapper[4892]: I0217 17:44:35.989316 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:35Z","lastTransitionTime":"2026-02-17T17:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.091586 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.091628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.091640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.091658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.091670 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.193951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.194005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.194021 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.194048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.194066 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.296521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.296585 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.296602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.296626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.296642 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.320285 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 08:02:22.141176233 +0000 UTC Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.359338 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.359455 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:36 crc kubenswrapper[4892]: E0217 17:44:36.359510 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:36 crc kubenswrapper[4892]: E0217 17:44:36.359682 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.400517 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.400578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.400595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.400621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.400640 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.503577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.503647 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.503664 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.503686 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.503702 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.606266 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.606307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.606316 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.606331 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.606341 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.709112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.709179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.709201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.709230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.709251 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.812422 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.812460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.812470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.812484 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.812494 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.915472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.915526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.915549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.915576 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:36 crc kubenswrapper[4892]: I0217 17:44:36.915598 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:36Z","lastTransitionTime":"2026-02-17T17:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.018201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.018259 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.018281 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.018308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.018332 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.121354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.121386 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.121396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.121409 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.121419 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.223486 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.223518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.223526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.223538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.223548 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.321097 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 06:24:21.01893812 +0000 UTC Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.325891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.325936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.325951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.325975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.325991 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.359050 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:37 crc kubenswrapper[4892]: E0217 17:44:37.359212 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.359311 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:37 crc kubenswrapper[4892]: E0217 17:44:37.359413 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.429605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.429693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.429710 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.429734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.429751 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.533449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.533550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.533571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.533595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.533611 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.636775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.636926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.636943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.636965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.636983 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.740030 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.740100 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.740118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.740142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.740160 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.842000 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.842085 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.842108 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.842202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.842222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.946649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.946729 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.946749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.946776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:37 crc kubenswrapper[4892]: I0217 17:44:37.946794 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:37Z","lastTransitionTime":"2026-02-17T17:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.050792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.050901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.050920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.050953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.050973 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.154665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.154744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.154761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.154788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.154806 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.258222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.258283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.258299 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.258330 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.258348 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.321916 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 14:29:11.155312869 +0000 UTC Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.358647 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:38 crc kubenswrapper[4892]: E0217 17:44:38.358922 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.359138 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:38 crc kubenswrapper[4892]: E0217 17:44:38.359448 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.367146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.367211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.367233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.367260 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.367278 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.470953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.470999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.471011 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.471027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.471037 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.574002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.574077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.574101 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.574138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.575141 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.678680 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.678728 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.678745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.678773 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.678791 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.782166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.782215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.782226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.782244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.782253 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.884845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.884932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.884951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.884978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.884998 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.987291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.987323 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.987334 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.987367 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:38 crc kubenswrapper[4892]: I0217 17:44:38.987378 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:38Z","lastTransitionTime":"2026-02-17T17:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.090388 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.090452 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.090468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.090491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.090508 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.194286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.194335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.194350 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.194369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.194379 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.297117 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.297164 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.297173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.297188 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.297198 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.322207 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 10:12:51.935247536 +0000 UTC Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.359282 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.359330 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:39 crc kubenswrapper[4892]: E0217 17:44:39.360135 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:39 crc kubenswrapper[4892]: E0217 17:44:39.360294 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.388923 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.400662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.400712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.400721 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.400736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.400748 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.405562 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc 
kubenswrapper[4892]: I0217 17:44:39.420557 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.432768 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.447317 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.459427 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e
587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.475370 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.490033 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.504849 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.504874 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.504885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.504902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.504914 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.507808 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.527499 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.550263 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.569697 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.584459 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.600634 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.607455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.607483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.607491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.607506 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.607516 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.618727 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.640103 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:39Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.709491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.709534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.709544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.709559 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.709570 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.812464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.812524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.812542 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.812569 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.812586 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.915920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.916040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.916060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.916083 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:39 crc kubenswrapper[4892]: I0217 17:44:39.916100 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:39Z","lastTransitionTime":"2026-02-17T17:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.019384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.019430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.019440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.019455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.019468 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.122211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.122263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.122274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.122293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.122306 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.224889 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.224936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.224952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.224979 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.224995 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.322572 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 21:49:15.159671732 +0000 UTC Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.328230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.328277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.328287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.328301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.328312 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.358441 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.358480 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:40 crc kubenswrapper[4892]: E0217 17:44:40.358630 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:40 crc kubenswrapper[4892]: E0217 17:44:40.358768 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.432099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.432172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.432191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.432215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.432232 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.535601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.535659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.535674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.535698 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.535714 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.638529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.638565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.638576 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.638591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.638603 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.740730 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.740782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.740799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.740847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.740866 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.843580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.843645 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.843666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.843695 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.843720 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.947238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.947276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.947285 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.947300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:40 crc kubenswrapper[4892]: I0217 17:44:40.947312 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:40Z","lastTransitionTime":"2026-02-17T17:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.050222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.050283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.050308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.050338 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.050361 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.108265 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.108545 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:45:13.10851542 +0000 UTC m=+84.483918725 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.153616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.153660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.153671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.153688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.153704 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.209205 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.209249 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.209274 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.209301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209332 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209422 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:45:13.209396975 +0000 UTC m=+84.584800250 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209425 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209451 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209508 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209520 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209472 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:45:13.209457897 +0000 UTC m=+84.584861262 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209577 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:45:13.209562129 +0000 UTC m=+84.584965494 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209575 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209623 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209643 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.209733 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:45:13.209708513 +0000 UTC m=+84.585111818 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.256903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.256976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.257000 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.257028 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.257045 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.323006 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 11:56:23.244869125 +0000 UTC Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.358474 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.358522 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.358700 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:41 crc kubenswrapper[4892]: E0217 17:44:41.358860 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.360850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.360899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.360915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.360939 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.360993 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.463528 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.463584 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.463605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.463637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.463659 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.566549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.566604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.566621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.566644 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.566763 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.669704 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.669749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.669757 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.669836 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.669849 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.772385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.772450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.772460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.772476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.772488 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.875407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.875470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.875486 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.875508 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.875526 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.978278 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.978334 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.978350 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.978371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:41 crc kubenswrapper[4892]: I0217 17:44:41.978388 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:41Z","lastTransitionTime":"2026-02-17T17:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.081890 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.081960 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.081982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.082013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.082033 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.184410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.184455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.184470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.184494 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.184512 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.287074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.287135 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.287152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.287176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.287193 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.324052 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 04:16:00.926098268 +0000 UTC Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.359391 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.359461 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:42 crc kubenswrapper[4892]: E0217 17:44:42.359570 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:42 crc kubenswrapper[4892]: E0217 17:44:42.359728 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.360687 4892 scope.go:117] "RemoveContainer" containerID="54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.391059 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.391118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.391135 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.391157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.391173 4892 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.423474 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:42 crc kubenswrapper[4892]: E0217 17:44:42.423723 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:42 crc kubenswrapper[4892]: E0217 17:44:42.423860 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:44:58.423836491 +0000 UTC m=+69.799239846 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.495223 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.495300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.495318 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.495348 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.495367 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.598292 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.598344 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.598361 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.598384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.598400 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.701448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.701500 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.701515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.701539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.701556 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.747178 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/1.log" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.764788 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.765477 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.787718 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.806539 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.806746 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.806806 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.806860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.806894 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.806918 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.827375 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.847351 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.868711 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.894852 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.896566 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.909268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.909390 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.909412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.909440 4892 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.909463 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:42Z","lastTransitionTime":"2026-02-17T17:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.912693 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.915436 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 
17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.932363 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.952718 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.968452 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.980559 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:42 crc kubenswrapper[4892]: I0217 17:44:42.994059 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:4
3:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60c
d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:42Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.010850 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.012536 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.012580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.012592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.012609 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.012620 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.028009 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.064607 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.091725 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.106795 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.114633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.114853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.114931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.115002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.115055 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.122733 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.135459 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.153622 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 
17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.169295 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc 
kubenswrapper[4892]: I0217 17:44:43.181147 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.195409 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.209901 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.217463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.217488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.217497 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.217511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.217521 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.225517 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.237680 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.250489 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.263735 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.274787 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.288177 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.303880 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.316214 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.320282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.320327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.320340 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 
17:44:43.320362 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.320374 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.324463 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 03:07:27.311163934 +0000 UTC Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.329588 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.359117 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.359182 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:43 crc kubenswrapper[4892]: E0217 17:44:43.359300 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:43 crc kubenswrapper[4892]: E0217 17:44:43.359471 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.423450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.423513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.423525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.423539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.423549 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.526930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.526995 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.527013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.527041 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.527060 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.630069 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.630462 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.630628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.630769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.630982 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.734991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.735097 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.735114 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.735137 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.735157 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.771187 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/2.log" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.772312 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/1.log" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.776909 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253" exitCode=1 Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.777081 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.777175 4892 scope.go:117] "RemoveContainer" containerID="54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.778455 4892 scope.go:117] "RemoveContainer" containerID="e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253" Feb 17 17:44:43 crc kubenswrapper[4892]: E0217 17:44:43.778968 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.805133 4892 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.821426 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.838697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.838740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.838757 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc 
kubenswrapper[4892]: I0217 17:44:43.838772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.838792 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.843883 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.864131 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.881901 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.895758 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.914584 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.928575 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.943672 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:43 crc 
kubenswrapper[4892]: I0217 17:44:43.943909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.944244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.944795 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.945007 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:43Z","lastTransitionTime":"2026-02-17T17:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.946574 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.960404 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc kubenswrapper[4892]: I0217 17:44:43.982447 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54854fd0e686549c10d435083457b3ca06743c27b074aa4cb690d468e52e7b85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:28Z\\\",\\\"message\\\":\\\" 6422 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 17:44:28.578986 6422 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 17:44:28.578992 6422 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 17:44:28.579012 6422 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI0217 17:44:28.579054 6422 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 17:44:28.579065 6422 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 17:44:28.579068 6422 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 17:44:28.579079 6422 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 17:44:28.579090 6422 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 17:44:28.579097 6422 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 17:44:28.579051 6422 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 17:44:28.579113 6422 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 17:44:28.579123 6422 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 17:44:28.579136 6422 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 17:44:28.579146 6422 factory.go:656] Stopping watch factory\\\\nI0217 17:44:28.579164 6422 ovnkube.go:599] Stopped ovnkube\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:43 crc 
kubenswrapper[4892]: I0217 17:44:43.997109 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:43Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc 
kubenswrapper[4892]: I0217 17:44:44.012295 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.031605 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.048485 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.048541 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.048567 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.048595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.048615 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.049080 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.065749 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.081264 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.150911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.150963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.150979 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.151001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.151018 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.254545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.254600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.254616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.254640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.254659 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.325185 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 04:45:56.823694226 +0000 UTC Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.357971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.358024 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.358040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.358064 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.358079 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.358468 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.358472 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:44 crc kubenswrapper[4892]: E0217 17:44:44.358617 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:44 crc kubenswrapper[4892]: E0217 17:44:44.358958 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.460572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.460933 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.461183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.461426 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.461755 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.564972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.565354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.565746 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.566179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.566537 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.671165 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.671457 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.671468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.671484 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.671497 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.774875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.774943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.774967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.774995 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.775016 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.781645 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/2.log" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.786985 4892 scope.go:117] "RemoveContainer" containerID="e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253" Feb 17 17:44:44 crc kubenswrapper[4892]: E0217 17:44:44.787740 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.800723 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.820715 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.838969 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.859260 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.877329 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.878268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.878320 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.878338 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.878365 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.878382 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.893953 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.907862 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.920578 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.936497 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.953092 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.970089 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.982314 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.982477 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.982505 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.982529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.982546 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:44Z","lastTransitionTime":"2026-02-17T17:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:44 crc kubenswrapper[4892]: I0217 17:44:44.989868 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d808063
78f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:44Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.013491 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955
d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7f
a93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.028976 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.048339 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.063249 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.077610 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:45Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.084443 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc 
kubenswrapper[4892]: I0217 17:44:45.084492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.084506 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.084525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.084540 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.187250 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.187307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.187317 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.187332 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.187366 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.290949 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.291012 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.291029 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.291055 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.291073 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.326418 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:49:06.019522191 +0000 UTC Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.359533 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.359564 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:45 crc kubenswrapper[4892]: E0217 17:44:45.359675 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:45 crc kubenswrapper[4892]: E0217 17:44:45.359853 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.394338 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.394409 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.394432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.394460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.394481 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.497539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.497606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.497622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.497646 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.497662 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.600928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.601003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.601025 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.601055 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.601123 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.704020 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.704066 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.704077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.704091 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.704102 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.807032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.807099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.807118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.807147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.807167 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.910106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.910184 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.910202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.910228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:45 crc kubenswrapper[4892]: I0217 17:44:45.910249 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:45Z","lastTransitionTime":"2026-02-17T17:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.013033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.013086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.013097 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.013116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.013128 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.116134 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.116182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.116191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.116206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.116217 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.218550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.218606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.218626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.218650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.218668 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.264469 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.264521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.264533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.264549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.264559 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.283603 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.289796 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.289899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.289911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.289931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.289942 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.308595 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.313601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.313643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.313655 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.313673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.313685 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.327607 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 06:37:37.659871873 +0000 UTC Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.331031 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",
\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.337308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.337392 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.337410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.337435 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.337452 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.354188 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.358568 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.358579 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.358747 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.358903 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.358922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.358969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.358985 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.359006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.359023 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.383849 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: E0217 17:44:46.384919 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.387120 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.387162 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.387179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.387205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.387222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.415421 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.436187 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.450507 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.476246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.491482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.491539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.491555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.491577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.491594 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.492717 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc 
kubenswrapper[4892]: I0217 17:44:46.507228 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x
fctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.524724 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.546125 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.562117 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.580153 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.595077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.595139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.595160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.595181 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.595194 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.604067 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.621997 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.641297 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.663229 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.682446 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.698346 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc 
kubenswrapper[4892]: I0217 17:44:46.698406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.698423 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.698448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.698465 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.703567 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.723877 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.741931 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:46Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.800870 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc 
kubenswrapper[4892]: I0217 17:44:46.800911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.800920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.800935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.800946 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.903555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.903598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.903610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.903626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:46 crc kubenswrapper[4892]: I0217 17:44:46.903684 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:46Z","lastTransitionTime":"2026-02-17T17:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.006180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.006226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.006236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.006254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.006264 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.108627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.109070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.109202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.109327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.109456 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.212274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.212348 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.212370 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.212398 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.212420 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.315481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.315521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.315531 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.315545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.315556 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.327951 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 19:37:00.985740699 +0000 UTC Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.358894 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.358916 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:47 crc kubenswrapper[4892]: E0217 17:44:47.359130 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:47 crc kubenswrapper[4892]: E0217 17:44:47.359516 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.418390 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.418459 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.418475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.418499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.418517 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.521363 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.521437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.521460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.521486 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.521505 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.624671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.624759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.624772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.624804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.624839 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.727893 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.727954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.727971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.727997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.728015 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.831600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.831650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.831673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.831704 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.831727 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.934294 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.934430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.934447 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.934471 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:47 crc kubenswrapper[4892]: I0217 17:44:47.934488 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:47Z","lastTransitionTime":"2026-02-17T17:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.037779 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.037888 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.037909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.037942 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.037962 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.140887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.140946 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.140962 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.140984 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.141001 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.243875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.243940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.243964 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.243992 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.244013 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.328038 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 13:31:37.78485061 +0000 UTC Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.347182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.347296 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.347315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.347338 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.347353 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.359352 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.359401 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:48 crc kubenswrapper[4892]: E0217 17:44:48.359449 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:48 crc kubenswrapper[4892]: E0217 17:44:48.359512 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.450213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.450263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.450279 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.450300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.450316 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.553633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.553695 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.553717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.553747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.553771 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.656912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.656952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.656968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.656993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.657010 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.759630 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.760594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.760782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.760951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.761101 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.863648 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.863708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.863724 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.863749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.863767 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.967105 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.967184 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.967208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.967233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:48 crc kubenswrapper[4892]: I0217 17:44:48.967254 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:48Z","lastTransitionTime":"2026-02-17T17:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.069905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.069956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.069969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.069989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.070004 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.173283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.173641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.173713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.173784 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.173904 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.276865 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.276903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.276911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.276924 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.276932 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.329015 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 23:12:20.621391372 +0000 UTC Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.361426 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:49 crc kubenswrapper[4892]: E0217 17:44:49.361587 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.361700 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:49 crc kubenswrapper[4892]: E0217 17:44:49.361865 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.378714 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.379706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.379860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.379975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.380062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.380177 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.391564 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.403363 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c
53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.420247 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-1
7T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.437579 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.457781 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776f
d4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.477242 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed9
6dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.486469 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.486530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.486546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.486570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.486588 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.497387 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2
a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.514545 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.528948 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.540463 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.556065 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.570277 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.585388 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc 
kubenswrapper[4892]: I0217 17:44:49.590143 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.590210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.590222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.590240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.590278 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.604940 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.619839 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.648963 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:49Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.693170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.693221 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.693237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.693258 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.693275 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.796107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.796149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.796160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.796176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.796188 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.898893 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.898935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.898946 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.898962 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:49 crc kubenswrapper[4892]: I0217 17:44:49.898974 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:49Z","lastTransitionTime":"2026-02-17T17:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.002011 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.002362 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.002537 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.002682 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.002845 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.109469 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.109522 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.109540 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.109561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.109579 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.211663 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.211690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.211697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.211709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.211717 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.313961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.314002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.314010 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.314024 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.314034 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.329728 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 01:21:39.836198648 +0000 UTC Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.359080 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.359088 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:50 crc kubenswrapper[4892]: E0217 17:44:50.359404 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:50 crc kubenswrapper[4892]: E0217 17:44:50.359264 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.416391 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.416433 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.416442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.416457 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.416469 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.519114 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.519146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.519155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.519167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.519178 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.621451 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.621506 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.621523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.621546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.621564 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.723207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.723245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.723256 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.723273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.723283 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.826509 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.826586 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.826604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.826627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.826644 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.933625 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.933701 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.933713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.933735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:50 crc kubenswrapper[4892]: I0217 17:44:50.933773 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:50Z","lastTransitionTime":"2026-02-17T17:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.036662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.036725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.036742 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.036766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.036783 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.140275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.140328 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.140344 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.140366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.140384 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.243002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.243045 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.243079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.243095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.243107 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.330629 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 23:25:07.079626905 +0000 UTC
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.345996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.346043 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.346054 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.346070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.346084 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.358480 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.358487 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 17:44:51 crc kubenswrapper[4892]: E0217 17:44:51.358648 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 17:44:51 crc kubenswrapper[4892]: E0217 17:44:51.358755 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.449752 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.449869 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.449893 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.449922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.449946 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.553345 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.553404 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.553421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.553443 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.553460 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.655734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.655766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.655777 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.655795 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.655806 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.758373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.758431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.758447 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.758469 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.758487 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.861743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.861794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.861809 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.861856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.861874 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.965060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.965138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.965154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.965188 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:51 crc kubenswrapper[4892]: I0217 17:44:51.965222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:51Z","lastTransitionTime":"2026-02-17T17:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.067797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.067853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.067862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.067876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.067885 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.170406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.171002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.171067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.175215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.175244 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.278229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.278298 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.278309 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.278325 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.278334 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.330966 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 09:10:16.881514029 +0000 UTC
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.358419 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.358487 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6"
Feb 17 17:44:52 crc kubenswrapper[4892]: E0217 17:44:52.358614 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 17:44:52 crc kubenswrapper[4892]: E0217 17:44:52.358757 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.381257 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.381294 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.381304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.381318 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.381327 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.484423 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.484479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.484501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.484527 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.484547 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.587687 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.587735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.587757 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.587786 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.587807 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.690653 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.690682 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.690690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.690703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.690711 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.793194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.793238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.793254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.793277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.793294 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.895626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.895668 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.895678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.895690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.895698 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.998608 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.998666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.998688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.998716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:52 crc kubenswrapper[4892]: I0217 17:44:52.998735 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:52Z","lastTransitionTime":"2026-02-17T17:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.101074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.101128 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.101144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.101166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.101183 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.204176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.204230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.204253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.204280 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.204300 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.307205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.307245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.307254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.307267 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.307276 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.331781 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 02:25:54.300210651 +0000 UTC
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.359318 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.359334 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 17:44:53 crc kubenswrapper[4892]: E0217 17:44:53.359536 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 17:44:53 crc kubenswrapper[4892]: E0217 17:44:53.359600 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.409366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.409421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.409437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.409460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.409478 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.511853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.512203 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.512212 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.512227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.512240 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.614501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.614536 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.614545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.614564 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.614573 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.717846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.717918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.717941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.717971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.717994 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.819375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.819418 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.819428 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.819443 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.819453 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.921777 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.921837 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.921846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.921860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:53 crc kubenswrapper[4892]: I0217 17:44:53.921869 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:53Z","lastTransitionTime":"2026-02-17T17:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.024269 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.024302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.024311 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.024326 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.024334 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.126263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.126301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.126310 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.126338 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.126348 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.228739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.228875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.228897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.228920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.228937 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.335425 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 03:40:07.309847255 +0000 UTC Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.336674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.336711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.336722 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.336740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.336750 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.358959 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:54 crc kubenswrapper[4892]: E0217 17:44:54.359091 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.359316 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:54 crc kubenswrapper[4892]: E0217 17:44:54.359394 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.438514 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.438548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.438560 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.438576 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.438587 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.540980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.541018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.541031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.541052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.541064 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.643533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.643579 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.643591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.643607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.643622 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.745980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.746020 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.746032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.746047 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.746059 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.848478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.848523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.848534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.848553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.848566 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.950855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.950895 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.950909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.950927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:54 crc kubenswrapper[4892]: I0217 17:44:54.950939 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:54Z","lastTransitionTime":"2026-02-17T17:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.053129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.053167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.053177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.053192 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.053203 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.155969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.156009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.156019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.156033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.156054 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.258839 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.258873 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.258886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.258902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.258912 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.336027 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 06:09:52.655444891 +0000 UTC Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.359332 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:55 crc kubenswrapper[4892]: E0217 17:44:55.359445 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.359463 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:55 crc kubenswrapper[4892]: E0217 17:44:55.359684 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.360656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.360680 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.360689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.360700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.360710 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.462550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.462608 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.462620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.462639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.462653 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.564485 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.564522 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.564532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.564545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.564554 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.667161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.667209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.667222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.667241 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.667254 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.769171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.769207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.769215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.769231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.769240 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.871844 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.871903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.871920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.871944 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.871960 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.973922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.973978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.973997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.974020 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:55 crc kubenswrapper[4892]: I0217 17:44:55.974039 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:55Z","lastTransitionTime":"2026-02-17T17:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.076490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.076532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.076542 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.076557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.076568 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.179639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.179672 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.179680 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.179693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.179701 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.281633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.281665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.281674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.281687 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.281696 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.336156 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 10:18:00.038170242 +0000 UTC Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.358423 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.358562 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.358666 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.358788 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.384423 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.384471 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.384485 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.384501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.384775 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.487204 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.487243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.487254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.487271 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.487283 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.589717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.589782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.589805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.589868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.589891 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.692397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.692433 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.692444 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.692459 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.692470 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.702517 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.702544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.702555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.702567 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.702576 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.713324 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.716499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.716525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.716534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.716548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.716557 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.725872 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.729320 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.729349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.729359 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.729374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.729382 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.739142 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.742753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.742803 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.742840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.742858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.742869 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.755507 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.759742 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.759836 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.759854 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.759905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.759924 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.772110 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:56Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:56 crc kubenswrapper[4892]: E0217 17:44:56.772405 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.794141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.794172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.794188 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.794207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.794223 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.896782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.896869 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.896886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.896939 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.896956 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.998637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.998683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.998732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.998752 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:56 crc kubenswrapper[4892]: I0217 17:44:56.998767 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:56Z","lastTransitionTime":"2026-02-17T17:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.100151 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.100182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.100192 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.100208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.100219 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.202283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.202314 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.202322 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.202335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.202343 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.303993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.304013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.304021 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.304032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.304042 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.336701 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 09:26:11.437429908 +0000 UTC Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.358983 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:57 crc kubenswrapper[4892]: E0217 17:44:57.359067 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.358988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:57 crc kubenswrapper[4892]: E0217 17:44:57.359132 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.405567 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.405614 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.405663 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.405688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.405786 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.508073 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.508106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.508116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.508131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.508142 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.610186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.610228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.610238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.610253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.610263 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.712797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.712872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.712885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.712901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.712913 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.815444 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.815475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.815487 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.815501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.815511 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.917601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.917642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.917654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.917669 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:57 crc kubenswrapper[4892]: I0217 17:44:57.917680 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:57Z","lastTransitionTime":"2026-02-17T17:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.019640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.019695 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.019718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.019744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.019766 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.122160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.122202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.122214 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.122229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.122242 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.224940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.224974 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.224989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.225004 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.225013 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.327860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.327891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.327901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.327915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.327927 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.337182 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 05:10:46.522159912 +0000 UTC Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.358333 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:44:58 crc kubenswrapper[4892]: E0217 17:44:58.358415 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.358549 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:58 crc kubenswrapper[4892]: E0217 17:44:58.358758 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.358938 4892 scope.go:117] "RemoveContainer" containerID="e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253" Feb 17 17:44:58 crc kubenswrapper[4892]: E0217 17:44:58.359138 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.429626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.429654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.429664 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.429677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.429687 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.493471 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:44:58 crc kubenswrapper[4892]: E0217 17:44:58.493601 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:58 crc kubenswrapper[4892]: E0217 17:44:58.493666 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:45:30.493643967 +0000 UTC m=+101.869047272 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.531311 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.531354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.531363 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.531375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.531385 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.633736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.633792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.633808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.633898 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.633920 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.735550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.735580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.735588 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.735600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.735609 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.836808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.836842 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.836851 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.836864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.836880 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.938712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.938764 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.938773 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.938786 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:58 crc kubenswrapper[4892]: I0217 17:44:58.938796 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:58Z","lastTransitionTime":"2026-02-17T17:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.040468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.040530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.040552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.040580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.040602 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.142639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.142667 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.142675 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.142687 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.142696 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.245789 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.245833 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.245847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.245862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.245874 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.337824 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 21:23:49.783683711 +0000 UTC Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.348145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.348172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.348181 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.348211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.348221 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.358728 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.358932 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:44:59 crc kubenswrapper[4892]: E0217 17:44:59.359028 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:44:59 crc kubenswrapper[4892]: E0217 17:44:59.359276 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.379209 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.382197 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.394682 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.407128 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.417629 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.425200 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.444281 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.454201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.454238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.454249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.454265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.454275 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.455935 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc 
kubenswrapper[4892]: I0217 17:44:59.468569 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa5914a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.481903 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d
81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.494451 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.509319 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.524021 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.537438 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.548321 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.556177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.556229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.556268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc 
kubenswrapper[4892]: I0217 17:44:59.556284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.556297 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.562569 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.574971 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.590633 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f3
4c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:
44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.659460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.659530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.659543 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.659559 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.659592 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.762148 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.762196 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.762208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.762225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.762238 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.841931 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/0.log" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.841978 4892 generic.go:334] "Generic (PLEG): container finished" podID="43b12f44-0079-4031-9b1d-492c374250df" containerID="5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb" exitCode=1 Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.842024 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerDied","Data":"5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.842429 4892 scope.go:117] "RemoveContainer" containerID="5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.860470 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.864420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.864448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.864456 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 
17:44:59.864470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.864479 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.873675 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.884754 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.900559 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.912540 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.939559 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.950653 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.960435 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.966197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.966222 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.966232 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.966247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.966257 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:44:59Z","lastTransitionTime":"2026-02-17T17:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.977197 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d
81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:44:59 crc kubenswrapper[4892]: I0217 17:44:59.991471 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:44:59Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.006097 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.019288 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.030925 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.046031 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.059997 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.068042 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.068063 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.068072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc 
kubenswrapper[4892]: I0217 17:45:00.068084 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.068093 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.073618 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.088758 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.105919 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed9
6dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.170159 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.170195 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.170206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.170224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.170248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.272893 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.272953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.272972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.272995 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.273011 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.338972 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 02:07:00.580805267 +0000 UTC Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.358989 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.359067 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:00 crc kubenswrapper[4892]: E0217 17:45:00.359189 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:00 crc kubenswrapper[4892]: E0217 17:45:00.359485 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.375339 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.375375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.375383 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.375398 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.375408 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.478059 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.478112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.478133 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.478158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.478177 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.580772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.580873 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.580891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.580916 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.580932 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.683528 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.683585 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.683605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.683633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.683655 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.786570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.786641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.786665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.786696 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.786718 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.847916 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/0.log" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.848171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerStarted","Data":"6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.871089 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.884879 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.889794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.889867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.889885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.889910 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.889928 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.900401 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.913637 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.926903 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.942907 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.958558 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.975513 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.990096 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:00Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.992236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.992296 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.992321 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.992354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:00 crc kubenswrapper[4892]: I0217 17:45:00.992377 4892 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:00Z","lastTransitionTime":"2026-02-17T17:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.005835 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.025236 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.038774 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.058937 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5
f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.076445 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.095395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.095434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.095611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.095640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.095656 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.096473 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.116963 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.131552 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.148008 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:01Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.198653 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.198700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.198716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.198739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.198755 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.300354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.300386 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.300396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.300407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.300416 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.340134 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 04:59:48.633218193 +0000 UTC Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.358750 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:01 crc kubenswrapper[4892]: E0217 17:45:01.358932 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.359158 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:01 crc kubenswrapper[4892]: E0217 17:45:01.359320 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.407182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.407247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.407264 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.407289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.407313 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.510480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.510522 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.510538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.510557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.510574 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.612999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.613036 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.613046 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.613063 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.613075 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.715330 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.715369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.715379 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.715395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.715406 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.818008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.818050 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.818061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.818078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.818090 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.920863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.920915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.920930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.920954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:01 crc kubenswrapper[4892]: I0217 17:45:01.920970 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:01Z","lastTransitionTime":"2026-02-17T17:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.023023 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.023062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.023075 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.023092 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.023104 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.125211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.125290 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.125299 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.125313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.125322 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.228318 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.228360 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.228376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.228397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.228413 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.331089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.331130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.331147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.331170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.331188 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.432918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.432989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.433005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.433024 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.433039 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.536049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.536090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.536106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.536128 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.536144 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.638365 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.638384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.638394 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.638406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.638417 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.740978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.741046 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.741069 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.741094 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.741113 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.842581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.842603 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.842610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.842620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.842628 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.944917 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.944943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.944951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.944963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:02 crc kubenswrapper[4892]: I0217 17:45:02.944972 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:02Z","lastTransitionTime":"2026-02-17T17:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.047557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.047591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.047602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.047616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.047627 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.151250 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.151286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.151296 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.151315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.151327 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.255091 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.255149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.255168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.255194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.255216 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.357723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.357769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.357785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.357807 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.357875 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.460743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.460801 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.460853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.460882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.460903 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.540042 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:37:12.148829503 +0000 UTC Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.541090 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:03 crc kubenswrapper[4892]: E0217 17:45:03.541231 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.541279 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.541368 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:03 crc kubenswrapper[4892]: E0217 17:45:03.541540 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.541640 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:03 crc kubenswrapper[4892]: E0217 17:45:03.541759 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:03 crc kubenswrapper[4892]: E0217 17:45:03.541965 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.562448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.562475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.562482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.562495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.562503 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.665534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.665589 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.665606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.665629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.665645 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.768285 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.768322 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.768330 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.768345 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.768355 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.870859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.870920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.870958 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.870983 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.871003 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.973407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.973442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.973450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.973463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:03 crc kubenswrapper[4892]: I0217 17:45:03.973472 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:03Z","lastTransitionTime":"2026-02-17T17:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.075660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.075694 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.075702 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.075716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.075725 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.178482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.178533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.178549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.178573 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.178591 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.281257 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.281348 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.281399 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.281424 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.281442 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.383521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.383549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.383560 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.383575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.383586 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.487157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.487203 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.487213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.487227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.487238 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.540696 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 02:47:14.520259656 +0000 UTC Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.589658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.589710 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.589727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.589750 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.589767 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.692452 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.692513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.692530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.692555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.692572 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.795018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.795070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.795086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.795107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.795127 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.898097 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.898132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.898140 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.898152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:04 crc kubenswrapper[4892]: I0217 17:45:04.898161 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:04Z","lastTransitionTime":"2026-02-17T17:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.001104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.001141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.001150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.001165 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.001173 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.103855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.103911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.103928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.103951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.103968 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.206802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.206886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.206906 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.206933 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.206954 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.309190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.309225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.309234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.309249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.309259 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.359553 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:05 crc kubenswrapper[4892]: E0217 17:45:05.359691 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.359768 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:05 crc kubenswrapper[4892]: E0217 17:45:05.359865 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.360126 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:05 crc kubenswrapper[4892]: E0217 17:45:05.360192 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.360412 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:05 crc kubenswrapper[4892]: E0217 17:45:05.360471 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.412232 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.412269 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.412276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.412292 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.412303 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.515293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.515361 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.515380 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.515405 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.515422 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.541730 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:46:01.671707602 +0000 UTC Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.617795 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.617876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.617896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.617921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.617937 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.720152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.720182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.720191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.720204 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.720215 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.823912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.823988 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.824008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.824027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.824040 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.927967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.928016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.928036 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.928060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:05 crc kubenswrapper[4892]: I0217 17:45:05.928077 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:05Z","lastTransitionTime":"2026-02-17T17:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.031270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.031335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.031354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.031384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.031403 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.134529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.134572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.134588 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.134611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.134628 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.237470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.237536 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.237553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.237585 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.237602 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.341155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.341208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.341223 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.341246 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.341263 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.443502 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.443563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.443581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.443607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.443626 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.542275 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:51:05.611355589 +0000 UTC Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.547091 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.547152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.547168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.547191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.547207 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.650759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.650882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.650911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.650937 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.650957 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.753702 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.753738 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.753747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.753760 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.753770 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.856767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.856861 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.856879 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.856901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.856917 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.959925 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.960006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.960029 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.960055 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:06 crc kubenswrapper[4892]: I0217 17:45:06.960076 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:06Z","lastTransitionTime":"2026-02-17T17:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.063272 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.063483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.063560 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.063637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.063728 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.081103 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.081144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.081154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.081166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.081176 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.098945 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.104155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.104229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.104254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.104282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.104304 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.124480 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.129201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.129260 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.129277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.129300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.129317 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.148922 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.153788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.153835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.153846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.153859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.153868 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.170923 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.175824 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.175858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.175868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.175882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.175896 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.193781 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:07Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.193944 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.195775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.195807 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.195835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.195849 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.195857 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.299177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.299255 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.299272 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.299298 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.299316 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.358753 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.358876 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.358875 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.358986 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.359072 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.359259 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.359399 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:07 crc kubenswrapper[4892]: E0217 17:45:07.359505 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.402349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.402403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.402420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.402444 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.402462 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.504745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.504800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.504840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.504863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.504880 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.543119 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 07:43:24.060303563 +0000 UTC Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.608147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.608213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.608230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.608253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.608271 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.711495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.711547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.711562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.711578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.711589 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.814654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.814707 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.814718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.814736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.814750 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.917977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.918058 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.918086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.918116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:07 crc kubenswrapper[4892]: I0217 17:45:07.918138 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:07Z","lastTransitionTime":"2026-02-17T17:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.020800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.020938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.020963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.020992 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.021013 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.124075 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.124131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.124149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.124171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.124187 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.227460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.227527 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.227544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.227570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.227587 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.330678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.330741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.330759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.330786 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.330802 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.434548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.434595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.434607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.434629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.434638 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.538415 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.538477 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.538495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.538518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.538534 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.543624 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:51:36.749363196 +0000 UTC Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.641897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.641959 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.641982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.642009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.642030 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.744706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.744745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.744754 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.744801 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.744833 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.848045 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.848106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.848124 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.848153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.848176 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.951073 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.951132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.951149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.951173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:08 crc kubenswrapper[4892]: I0217 17:45:08.951191 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:08Z","lastTransitionTime":"2026-02-17T17:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.054205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.054261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.054270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.054287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.054317 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.157744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.157808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.157860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.157889 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.157912 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.260601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.260662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.260733 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.260763 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.260863 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.358965 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.359022 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.359026 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.358970 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:09 crc kubenswrapper[4892]: E0217 17:45:09.359152 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:09 crc kubenswrapper[4892]: E0217 17:45:09.359350 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:09 crc kubenswrapper[4892]: E0217 17:45:09.360137 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:09 crc kubenswrapper[4892]: E0217 17:45:09.360301 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.363021 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.363048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.363058 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.363072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.363085 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.374197 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.395201 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d
81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.409480 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.428277 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.440601 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.458520 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.465565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.465645 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.465669 4892 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.465699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.465722 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.478018 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.497753 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.516545 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.536074 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.544382 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 23:01:00.174454501 +0000 UTC Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.561374 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed9
6dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.569077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.569191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.569208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.569290 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.569337 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.575133 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.593492 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.607601 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.626961 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.642495 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.671579 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.678557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.678621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.678641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.678697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.678720 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.692158 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:09Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:09 crc 
kubenswrapper[4892]: I0217 17:45:09.781845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.781875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.781884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.781897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.781906 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.884105 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.884164 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.884181 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.884241 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.884260 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.987493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.987560 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.987577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.987605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:09 crc kubenswrapper[4892]: I0217 17:45:09.987621 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:09Z","lastTransitionTime":"2026-02-17T17:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.089932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.089977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.089990 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.090007 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.090048 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.193291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.193339 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.193371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.193394 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.193404 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.296233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.296306 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.296327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.296357 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.296379 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.359539 4892 scope.go:117] "RemoveContainer" containerID="e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.399360 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.399417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.399441 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.399471 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.399496 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.503372 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.503432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.503454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.503478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.503494 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.544734 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 03:15:24.368314372 +0000 UTC Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.605972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.606116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.606247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.606329 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.606398 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.711975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.712027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.712044 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.712066 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.712084 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.814802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.814915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.814968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.814999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.815020 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.917114 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.917169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.917179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.917194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:10 crc kubenswrapper[4892]: I0217 17:45:10.917205 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:10Z","lastTransitionTime":"2026-02-17T17:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.019350 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.019387 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.019397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.019417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.019427 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.122483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.122524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.122535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.122553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.122563 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.224950 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.225003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.225017 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.225033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.225046 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.327592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.327642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.327656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.327676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.327692 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.359235 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.359279 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.359308 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.359289 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:11 crc kubenswrapper[4892]: E0217 17:45:11.359391 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:11 crc kubenswrapper[4892]: E0217 17:45:11.359577 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:11 crc kubenswrapper[4892]: E0217 17:45:11.359650 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:11 crc kubenswrapper[4892]: E0217 17:45:11.359712 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.429940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.429985 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.429994 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.430009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.430019 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.532082 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.532135 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.532151 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.532173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.532190 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.545258 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 22:12:13.524165309 +0000 UTC Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.582802 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/3.log" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.583695 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/2.log" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.587437 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" exitCode=1 Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.587487 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.587541 4892 scope.go:117] "RemoveContainer" containerID="e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.588736 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:45:11 crc kubenswrapper[4892]: E0217 17:45:11.589151 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.610370 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] 
Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"mul
tus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.635871 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.635969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.635996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.636027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.636048 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.636400 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.657467 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.678196 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.694176 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.710519 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.728925 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.738270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.738315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.738351 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.738375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.738391 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.748745 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.760660 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc 
kubenswrapper[4892]: I0217 17:45:11.779065 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.788579 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.804723 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:45:11Z\\\",\\\"message\\\":\\\"ailed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:45:11.478842 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478739 6999 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:45:11.478855 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478682 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478864 6999 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d in node crc\\\\nI0217 17:45:11.478876 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478883 6999 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-74\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 
17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.815784 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.826282 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.838235 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.841144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.841184 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.841195 4892 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.841211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.841223 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.848898 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.863116 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.876477 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.944140 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.944210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.944233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.944265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:11 crc kubenswrapper[4892]: I0217 17:45:11.944288 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:11Z","lastTransitionTime":"2026-02-17T17:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.047619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.047677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.047694 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.047720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.047738 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.149988 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.150053 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.150070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.150096 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.150115 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.253324 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.253410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.253437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.253464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.253483 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.356175 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.356240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.356261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.356286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.356307 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.459783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.459846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.459858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.459876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.459889 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.545683 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 19:09:59.598892363 +0000 UTC Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.562946 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.563006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.563029 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.563057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.563079 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.593981 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/3.log" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.665590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.665930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.666047 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.666170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.666277 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.768458 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.768490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.768499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.768513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.768525 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.870938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.871270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.871402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.871597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.871743 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.974492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.974563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.974586 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.974616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:12 crc kubenswrapper[4892]: I0217 17:45:12.974638 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:12Z","lastTransitionTime":"2026-02-17T17:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.077529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.077575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.077585 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.077601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.077612 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.149630 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.150018 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:46:17.149982426 +0000 UTC m=+148.525385731 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.179971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.180030 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.180052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.180080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.180102 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.251120 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.251191 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.251232 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.251281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251448 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251463 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251490 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251634 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251653 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251483 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251711 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251721 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251605 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.251567928 +0000 UTC m=+148.626971233 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251775 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.251758653 +0000 UTC m=+148.627162008 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251787 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.251781414 +0000 UTC m=+148.627184809 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.251798 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.251792594 +0000 UTC m=+148.627195989 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.282932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.282992 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.283005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.283022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.283033 4892 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.358927 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.358982 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.358989 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.359027 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.359167 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.359287 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.359421 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:13 crc kubenswrapper[4892]: E0217 17:45:13.359565 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.385124 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.385152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.385162 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.385174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.385185 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.487935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.488223 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.488237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.488257 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.488269 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.546409 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 20:25:59.890891979 +0000 UTC Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.590847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.590913 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.590937 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.590966 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.590988 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.693562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.693621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.693639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.693662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.693679 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.797654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.797711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.797728 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.797752 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.797768 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.900679 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.900735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.900751 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.900776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:13 crc kubenswrapper[4892]: I0217 17:45:13.900792 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:13Z","lastTransitionTime":"2026-02-17T17:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.003669 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.003758 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.003777 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.003800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.003851 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.106068 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.106098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.106109 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.106123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.106132 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.208525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.208573 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.208583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.208598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.208607 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.311328 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.311402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.311421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.311445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.311462 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.414067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.414124 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.414147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.414176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.414197 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.516327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.516629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.516809 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.517041 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.517226 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.548678 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 04:35:03.057079973 +0000 UTC Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.619782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.619870 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.619882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.619901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.619916 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.722418 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.722472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.722489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.722513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.722530 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.825940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.826012 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.826032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.826063 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.826084 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.928989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.929057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.929078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.929168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:14 crc kubenswrapper[4892]: I0217 17:45:14.929197 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:14Z","lastTransitionTime":"2026-02-17T17:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.032348 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.032494 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.032514 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.032555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.032573 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.136221 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.136275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.136293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.136316 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.136333 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.239789 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.239880 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.239903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.239932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.239959 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.343040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.343101 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.343123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.343155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.343177 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.358680 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:15 crc kubenswrapper[4892]: E0217 17:45:15.358895 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.358974 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.358997 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.359010 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:15 crc kubenswrapper[4892]: E0217 17:45:15.359347 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:15 crc kubenswrapper[4892]: E0217 17:45:15.359436 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:15 crc kubenswrapper[4892]: E0217 17:45:15.360977 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.445504 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.445547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.445557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.445574 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.445584 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.548690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.548769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.548780 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.548803 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.548836 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.548883 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 18:34:49.986144286 +0000 UTC Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.652042 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.652132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.652149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.652174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.652190 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.755565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.755606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.755618 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.755630 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.755641 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.858548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.858628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.858646 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.858676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.858701 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.961783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.961892 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.961909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.961932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:15 crc kubenswrapper[4892]: I0217 17:45:15.961949 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:15Z","lastTransitionTime":"2026-02-17T17:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.064906 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.064965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.064984 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.065009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.065027 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.167797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.167899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.167916 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.167941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.167959 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.270084 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.270148 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.270166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.270194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.270211 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.373254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.373309 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.373325 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.373347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.373364 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.476080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.476117 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.476126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.476142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.476152 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.549330 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 14:24:46.794170445 +0000 UTC Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.578621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.578674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.578690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.578715 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.578732 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.681795 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.681892 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.681912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.681936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.681955 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.785205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.785257 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.785277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.785300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.785317 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.888432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.888491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.888508 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.888532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.888547 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.991514 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.991575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.991591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.991614 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:16 crc kubenswrapper[4892]: I0217 17:45:16.991637 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:16Z","lastTransitionTime":"2026-02-17T17:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.094265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.094319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.094336 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.094358 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.094375 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.196905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.196961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.196979 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.197001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.197051 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.300011 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.300069 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.300087 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.300112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.300129 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.317677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.317732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.317750 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.317774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.317792 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.338265 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.343086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.343138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.343157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.343176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.343193 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.359503 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.359695 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.359921 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.360221 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.360315 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.360384 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.360639 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.360736 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.362712 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.367264 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.367317 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.367333 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.367356 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.367376 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.387835 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.395085 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.395123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.395132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.395147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.395158 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.413350 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.417967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.418004 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.418020 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.418041 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.418057 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.437802 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:17Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:17 crc kubenswrapper[4892]: E0217 17:45:17.438304 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.440676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.440755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.440774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.440796 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.440841 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.543898 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.543954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.543971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.543993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.544010 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.550369 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 06:25:06.429066862 +0000 UTC Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.646479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.646532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.646548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.646573 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.646591 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.749077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.749118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.749128 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.749145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.749158 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.852278 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.852321 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.852329 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.852343 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.852353 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.954704 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.954765 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.954783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.954836 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:17 crc kubenswrapper[4892]: I0217 17:45:17.954854 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:17Z","lastTransitionTime":"2026-02-17T17:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.057613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.057655 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.057666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.057684 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.057696 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.160089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.160163 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.160233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.160262 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.160283 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.263966 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.264032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.264056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.264084 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.264105 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.366739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.366808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.366856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.366880 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.366898 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.469479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.469556 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.469594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.469625 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.469648 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.551469 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 05:16:47.647337698 +0000 UTC Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.572594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.572634 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.572642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.572658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.572668 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.675673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.675719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.675729 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.675744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.675755 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.778169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.778207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.778219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.778237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.778268 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.880782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.880863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.880872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.880886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.880894 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.984355 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.984421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.984445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.984473 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:18 crc kubenswrapper[4892]: I0217 17:45:18.984492 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:18Z","lastTransitionTime":"2026-02-17T17:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.087196 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.087255 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.087271 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.087299 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.087316 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.191133 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.191242 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.191259 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.191283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.191301 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.294404 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.294455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.294472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.294495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.294511 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.359560 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.359744 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.359761 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.359807 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:19 crc kubenswrapper[4892]: E0217 17:45:19.360065 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:19 crc kubenswrapper[4892]: E0217 17:45:19.360264 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:19 crc kubenswrapper[4892]: E0217 17:45:19.360472 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:19 crc kubenswrapper[4892]: E0217 17:45:19.360797 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.381652 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.397555 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.398308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.398361 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.398458 4892 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.398513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.398530 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.437034 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.469429 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.487283 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.500864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.500936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.500950 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.500967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.501361 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.501383 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.521166 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.534647 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.550047 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.551776 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 22:51:28.876738176 +0000 UTC Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.566922 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.584449 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.603184 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.603460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.603526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.603546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 
17:45:19.603571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.603588 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.622935 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.639117 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352
cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.653338 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.667605 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.696913 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71385b1a285f0da1c91a9a94188fbfd412225fc8c839572ec619b3363017253\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:43Z\\\",\\\"message\\\":\\\"ef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 17:44:43.370589 6578 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nI0217 17:44:43.369483 6578 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0217 17:44:43.370645 6578 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0217 17:44:43.370651 6578 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0217 17:44:43.370550 6578 base_netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:45:11Z\\\",\\\"message\\\":\\\"ailed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:45:11.478842 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478739 6999 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:45:11.478855 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478682 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478864 6999 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d in node crc\\\\nI0217 17:45:11.478876 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478883 6999 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-74\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 
17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.706328 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.706379 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.706395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.706419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.706437 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.712451 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:19Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:19 crc 
kubenswrapper[4892]: I0217 17:45:19.809279 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.809355 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.809378 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.809405 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.809427 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.911897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.911973 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.911994 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.912022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:19 crc kubenswrapper[4892]: I0217 17:45:19.912041 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:19Z","lastTransitionTime":"2026-02-17T17:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.015455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.015496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.015512 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.015533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.015551 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.117747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.117792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.117803 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.117834 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.117845 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.221643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.221722 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.221733 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.221751 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.221762 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.324485 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.324552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.324571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.324602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.324624 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.435288 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.435342 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.435359 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.435383 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.435399 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.538331 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.538426 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.538445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.538470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.538488 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.552657 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 12:37:03.650732882 +0000 UTC Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.641699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.641766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.641782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.641807 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.641866 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.745216 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.745275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.745327 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.745358 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.745381 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.848142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.848223 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.848249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.848282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.848307 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.950943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.950999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.951016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.951041 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:20 crc kubenswrapper[4892]: I0217 17:45:20.951059 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:20Z","lastTransitionTime":"2026-02-17T17:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.053038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.053076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.053086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.053100 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.053110 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.155749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.155790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.155803 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.155833 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.155843 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.258997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.259051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.259067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.259088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.259105 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.359492 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.359548 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.359570 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.359756 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:21 crc kubenswrapper[4892]: E0217 17:45:21.359746 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:21 crc kubenswrapper[4892]: E0217 17:45:21.359884 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:21 crc kubenswrapper[4892]: E0217 17:45:21.359942 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:21 crc kubenswrapper[4892]: E0217 17:45:21.360014 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.361384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.361454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.361465 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.361481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.361493 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.464547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.464592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.464602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.464617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.464629 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.553400 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 11:05:14.533367685 +0000 UTC Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.567478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.567530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.567546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.567569 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.567586 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.707553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.707620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.707642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.707671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.707696 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.810953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.811013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.811024 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.811039 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.811049 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.913977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.914015 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.914028 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.914043 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:21 crc kubenswrapper[4892]: I0217 17:45:21.914054 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:21Z","lastTransitionTime":"2026-02-17T17:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.017149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.017257 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.017274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.017302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.017318 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.119794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.119841 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.119849 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.119860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.119868 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.222229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.222277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.222293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.222315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.222332 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.325031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.325098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.325116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.325141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.325164 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.359951 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:45:22 crc kubenswrapper[4892]: E0217 17:45:22.360206 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.378993 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID
\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.397763 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.412943 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.428674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.428720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.428743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc 
kubenswrapper[4892]: I0217 17:45:22.428767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.428784 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.432219 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.448258 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.468457 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.490751 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.508961 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.529781 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.531738 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.531827 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.531867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.531887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.531898 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.544474 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.553622 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 05:33:49.205804377 +0000 UTC Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.577453 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:45:11Z\\\",\\\"message\\\":\\\"ailed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:45:11.478842 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478739 6999 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:45:11.478855 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478682 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478864 6999 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d in node crc\\\\nI0217 17:45:11.478876 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478883 6999 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-74\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:45:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.594749 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.607125 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.620238 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.633626 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.634580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.634627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.634640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.634660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.634671 4892 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.654538 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.671492 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.688275 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:45:22Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.738059 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.738145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.738170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.738200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.738220 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.841034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.841070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.841079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.841092 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.841102 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.943975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.944036 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.944053 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.944079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:22 crc kubenswrapper[4892]: I0217 17:45:22.944100 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:22Z","lastTransitionTime":"2026-02-17T17:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.047125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.047190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.047207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.047231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.047248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.150235 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.150309 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.150329 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.150356 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.150373 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.253252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.253693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.253717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.253747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.253772 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.357310 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.357380 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.357404 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.357436 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.357457 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.358581 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.358615 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.358677 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:23 crc kubenswrapper[4892]: E0217 17:45:23.358798 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.358592 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:23 crc kubenswrapper[4892]: E0217 17:45:23.359038 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:23 crc kubenswrapper[4892]: E0217 17:45:23.359151 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:23 crc kubenswrapper[4892]: E0217 17:45:23.359186 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.460394 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.460444 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.460461 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.460486 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.460503 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.554404 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 00:01:26.458462432 +0000 UTC Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.565225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.565287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.565302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.565323 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.565337 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.667347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.667412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.667423 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.667439 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.667451 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.769669 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.769731 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.769748 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.769772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.769797 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.872901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.872991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.873040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.873067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.873083 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.976191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.976260 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.976273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.976290 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:23 crc kubenswrapper[4892]: I0217 17:45:23.976300 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:23Z","lastTransitionTime":"2026-02-17T17:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.078931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.078988 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.078998 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.079010 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.079019 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.181301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.181362 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.181386 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.181413 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.181435 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.283752 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.283784 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.283793 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.283805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.283840 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.387069 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.387118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.387134 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.387155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.387173 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.489867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.489911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.489926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.489941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.489951 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.554617 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 15:16:01.051608587 +0000 UTC Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.593154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.593212 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.593233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.593321 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.593361 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.696512 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.696551 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.696562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.696579 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.696590 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.799931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.799974 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.799983 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.799999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.800012 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.902851 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.902889 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.902898 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.902913 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:24 crc kubenswrapper[4892]: I0217 17:45:24.902923 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:24Z","lastTransitionTime":"2026-02-17T17:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.005626 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.005677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.005693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.005716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.005732 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.108133 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.108186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.108202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.108225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.108244 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.211170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.211227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.211243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.211265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.211282 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.313536 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.313577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.313587 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.313602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.313612 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.359397 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.359466 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.359499 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.359466 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:25 crc kubenswrapper[4892]: E0217 17:45:25.359653 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:25 crc kubenswrapper[4892]: E0217 17:45:25.360106 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:25 crc kubenswrapper[4892]: E0217 17:45:25.360399 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:25 crc kubenswrapper[4892]: E0217 17:45:25.360667 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.415781 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.415835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.415847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.415864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.415875 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.518124 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.518148 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.518156 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.518167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.518176 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.555650 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 06:34:56.325217701 +0000 UTC Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.621451 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.621893 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.622052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.622220 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.622353 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.725224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.725281 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.725298 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.725321 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.725341 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.829076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.829177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.829201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.829228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.829248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.932177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.932244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.932273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.932300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:25 crc kubenswrapper[4892]: I0217 17:45:25.932317 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:25Z","lastTransitionTime":"2026-02-17T17:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.035244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.035335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.035352 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.035375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.035392 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.138077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.138131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.138150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.138174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.138191 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.241541 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.241609 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.241628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.241650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.241667 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.344883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.344923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.344938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.344959 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.344975 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.380198 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.447804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.447887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.447906 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.447930 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.447947 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.551255 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.551310 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.551325 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.551350 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.551368 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.557662 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 19:54:50.22361171 +0000 UTC Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.653335 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.653389 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.653406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.653427 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.653445 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.756668 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.756708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.756718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.756733 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.756744 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.860323 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.860392 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.860409 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.860434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.860451 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.963157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.963211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.963227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.963250 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:26 crc kubenswrapper[4892]: I0217 17:45:26.963267 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:26Z","lastTransitionTime":"2026-02-17T17:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.066052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.066107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.066126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.066161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.066180 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.170349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.170417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.170438 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.170465 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.170487 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.272946 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.273023 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.273039 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.273064 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.273083 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.358997 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.359024 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.359078 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.359161 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.359178 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.359350 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.359382 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.359445 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.375866 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.375915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.375932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.375956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.375975 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.479183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.479251 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.479267 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.479290 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.479306 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.558162 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:50:05.787284691 +0000 UTC Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.582846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.582904 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.582955 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.582981 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.582997 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.685370 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.685407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.685416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.685429 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.685438 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.699011 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.699061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.699078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.699098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.699113 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.717770 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.722837 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.722869 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.722879 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.722915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.722929 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.740431 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.745552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.745603 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.745620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.745718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.746052 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.764965 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.770222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.770274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.770291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.770313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.770330 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.791809 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.796613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.796711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.796728 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.796750 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.796766 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.813653 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b854407-f7fd-494b-9ad1-f90f175f6ff2\\\",\\\"systemUUID\\\":\\\"06c95f3d-0382-44a5-9f64-04e3b3bbd534\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:27Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:27 crc kubenswrapper[4892]: E0217 17:45:27.813893 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.815676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.815717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.815726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.815737 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.815746 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.918856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.918929 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.918951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.918982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:27 crc kubenswrapper[4892]: I0217 17:45:27.919033 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:27Z","lastTransitionTime":"2026-02-17T17:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.021613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.021673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.021693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.021725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.021748 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.125179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.125232 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.125248 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.125271 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.125288 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.228616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.228674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.228683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.228700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.228710 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.331493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.331564 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.331581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.331606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.331623 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.434472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.434533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.434553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.434576 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.434595 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.537649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.537715 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.537732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.537756 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.537772 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.559260 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:35:14.380573652 +0000 UTC Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.640483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.640539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.640555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.640578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.640594 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.743149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.743262 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.743287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.743319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.743341 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.845988 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.846044 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.846065 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.846095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.846117 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.950162 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.950227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.950247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.950278 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:28 crc kubenswrapper[4892]: I0217 17:45:28.950300 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:28Z","lastTransitionTime":"2026-02-17T17:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.053605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.053655 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.053671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.053694 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.053711 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.156571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.156653 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.156677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.156703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.156720 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.258963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.259026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.259048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.259078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.259098 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.298734 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.299944 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:45:29 crc kubenswrapper[4892]: E0217 17:45:29.300145 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.359502 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.359573 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.360989 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:29 crc kubenswrapper[4892]: E0217 17:45:29.361196 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.361277 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:29 crc kubenswrapper[4892]: E0217 17:45:29.361422 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:29 crc kubenswrapper[4892]: E0217 17:45:29.361540 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:29 crc kubenswrapper[4892]: E0217 17:45:29.361637 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.362272 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.362334 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.362356 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.362385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.362405 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.378921 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9013d62-9809-436b-82a8-5b18dbf13e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57e4ceb22cb7d61e915ddf5e85dfee872d2088cccf3c8c4add63e1aa423a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5np8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6mhzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.400942 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a078938-20da-44c1-840e-060eab8bfe95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226b8c49d008f0a45b45ec1a8ffdfccc571c4eda281a8a17d5fb74c007dec6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34efb5409d2b86ecf4193ea82dc70ebf59c72f9b0f7460fb868e632f16b567d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1f0721415a6bfa0764f985f579345d35bac0a56978bff1bba4ae16a71ec2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ad26081caa5f1a5b3b789c7c08aea929b9bfbb73ff00ea14a3fef15873bcb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://691f098ec24b434c424081ff3d0323ffcfd387ade7ccece1fabee064b81b394e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://768f7d12789bbed0b254d9ad4fbb0364b45e71930bf5622bb9d9b02a7c91ae50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768f7d12789bbed0b254d9ad4fbb0364b45e71930bf5622bb9d9b02a7c91ae50\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266389586fe3d597964081c84348f75ccac547d179c85375d837f25a6ac57c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266389586fe3d597964081c84348f75ccac547d179c85375d837f25a6ac57c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08598fcc87010d29b5fa6e25f077f7b55537751361231fbde800aa278e2954f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08598fcc87010d29b5fa6e25f077f7b55537751361231fbde800aa278e2954f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.418592 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.435785 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba0c9be9062adff6361f81c4b2f399a56592fcef0d1a62defdf587f9accb3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.447489 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9290105c-74a4-487a-879f-3f79186b3b01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6h6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc 
kubenswrapper[4892]: I0217 17:45:29.464342 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e1e9d28fe57947c798967ce328638e778603a86874e79a2f4d99eab0b9867fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://489d551f2e552fe6e6a03f8e38593e2b978a7dc4053a999c3a11ebe404cf427f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.466146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.466175 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.466217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.466244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 
17:45:29.466259 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.482338 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p6jtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"646d0148-c138-45a2-8f68-51aee16aeff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1d56c160a6bfb1da9b1ca24666fbbdbd87911eaf961365a6cb097b0cc20f1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxbxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p6jtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.504727 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:45:11Z\\\",\\\"message\\\":\\\"ailed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:11Z is after 2025-08-24T17:21:41Z]\\\\nI0217 17:45:11.478842 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478739 6999 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 17:45:11.478855 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d\\\\nI0217 17:45:11.478682 6999 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478864 6999 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d in node crc\\\\nI0217 17:45:11.478876 6999 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-5v456\\\\nI0217 17:45:11.478883 6999 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-74\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:45:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6fa2591e1f8e8613e
d1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vcqf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kp5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.519470 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59304d1958e072cd19bd1e6a00f1487aa93563c8838c859aa7ed5222dc25c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.532336 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5v456" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62da95c0-b8b4-410e-a5e8-f3ab44db53b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4bd49fbfa0e7cff66ffc2a72aa0020a4a28551fa192e587067d648feb4d09f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfctg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5v456\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.546353 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5497ad5a-f96f-4f2e-ba79-a72f32527546\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cdd570c53fb9581ae506e73f11f51586ebffcc4561ba7176a7a64a25bd5e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ca1f6810f305bb127d5ebfbb39b92ffa591
4a3a245dcfe6f4adb805fea015e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8dcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k28d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.558313 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f7eaf0-f8c2-46f8-947e-adf3e613da11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ea6d6b01fd4e5622f4283bf6250451952776f0d2b162865b1705dbde6542eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65e1e3306f2becef082f51747a01f6f5f0c619c462d0b001bdc01827d19d6ad3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.559369 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 13:05:13.402514757 +0000 UTC Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.576144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.576200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.576222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 
17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.576251 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.576101 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4159c-33a5-42ef-9427-ad1fb2799470\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"im
ageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 17:44:08.433164 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 17:44:08.433536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 17:44:08.434768 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2781092108/tls.crt::/tmp/serving-cert-2781092108/tls.key\\\\\\\"\\\\nI0217 17:44:08.760474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 17:44:08.765628 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 17:44:08.765666 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 17:44:08.765700 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 17:44:08.765711 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 17:44:08.777802 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 17:44:08.777850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 17:44:08.777863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 17:44:08.777867 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 17:44:08.777870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 17:44:08.777873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 17:44:08.777883 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 17:44:08.781106 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.576272 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.595806 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af0cdeda-3c72-4700-b908-b8c0f0041cc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db577dd460e8c892dbb44799bae9ccdd3364c385f4bbfbd3a33a88bab1bcab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd66fb423f8b68d5e670ece2a7be1b
6e04e4dcd03ba42dc25bcf97cd9ec92b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0b98dba841562d65c0ab18e4f604c141352d4bbfb16825248fb209a73b23cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f344d7a0ae243f5e05b9e8c21f25a713411ee0921470e6632409800760aab22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:43:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.613751 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxpxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b12f44-0079-4031-9b1d-492c374250df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T17:44:59Z\\\",\\\"message\\\":\\\"2026-02-17T17:44:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1\\\\n2026-02-17T17:44:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d0b9a3e2-5264-4ab6-a42c-c965db521da1 to /host/opt/cni/bin/\\\\n2026-02-17T17:44:14Z [verbose] multus-daemon started\\\\n2026-02-17T17:44:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T17:44:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgp4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxpxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.633984 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202411f4-1b44-41a1-9b8a-23038a0e9bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc5
f2c5d87fb5b1529c03563955d9213cd72452aabadd9417afdf5a30b8a24f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f875898e917ad66dbfd850e5fb28ace90794d263c99c6d2f34c67eb4f4350e21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://011340409e55ebce1d692aa10f1267543f6225674216a5acb61642e9ab90bb91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa114647d495e3cb9e533fb2a81a7df9707571519af8f83bc659e8cddecfe5bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ed96dcc54dc0f19446ec26ab5b1203c4db69719ae3ff229ed0db7eae25f20d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a31a428f3ed559100c74918af2b1f900c7287344a3753c7b773d81293edbd17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe29b5bd0d82b04f9482f1c0c1ebb204f19a6edd1f56f1ee5da4b3a6537b5ff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T17:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T17:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9llf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4dsxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.648711 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83f8f121-9193-465b-9d6b-4dfbdcf26f98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T17:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca961a496bca89cff78c58c41585706c0673892df83e6c58e16655f83aeaa4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82362387569a5b049fdb827cb8ca5ecdacae36743ef142077b37441cc2e33b41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d08e8579fe7ac45bc28ca29b9e6bc0b795c68e4429e16319b75f2b6a1f60cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T17:43:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T17:43:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.664338 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.679346 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.679534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.679607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc 
kubenswrapper[4892]: I0217 17:45:29.679667 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.679726 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.679877 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T17:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T17:45:29Z is after 2025-08-24T17:21:41Z" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.783173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.783213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.783222 4892 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.783235 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.783244 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.886499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.886570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.886593 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.886622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.886646 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.989907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.989965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.989989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.990015 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:29 crc kubenswrapper[4892]: I0217 17:45:29.990038 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:29Z","lastTransitionTime":"2026-02-17T17:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.092326 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.092380 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.092398 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.092421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.092438 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.195125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.195193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.195214 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.195240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.195257 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.297121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.297149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.297158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.297170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.297178 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.400079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.400113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.400123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.400136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.400145 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.503104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.503141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.503150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.503164 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.503174 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.534770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:30 crc kubenswrapper[4892]: E0217 17:45:30.534935 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:45:30 crc kubenswrapper[4892]: E0217 17:45:30.534988 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs podName:9290105c-74a4-487a-879f-3f79186b3b01 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:34.534970598 +0000 UTC m=+165.910373863 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs") pod "network-metrics-daemon-2q4n6" (UID: "9290105c-74a4-487a-879f-3f79186b3b01") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.560016 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:45:09.798953649 +0000 UTC Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.606004 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.606060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.606082 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.606107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.606126 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.709376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.709414 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.709425 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.709440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.709450 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.811865 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.811935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.811968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.811999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.812020 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.914700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.914759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.914774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.914800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:30 crc kubenswrapper[4892]: I0217 17:45:30.914858 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:30Z","lastTransitionTime":"2026-02-17T17:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.018052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.018121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.018141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.018167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.018184 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.120927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.120969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.120980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.121000 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.121012 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.222678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.222723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.222739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.222761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.222777 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.325735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.325772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.325781 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.325794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.325806 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.358882 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.359005 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:31 crc kubenswrapper[4892]: E0217 17:45:31.359164 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.359247 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.359258 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:31 crc kubenswrapper[4892]: E0217 17:45:31.359405 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:31 crc kubenswrapper[4892]: E0217 17:45:31.359668 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:31 crc kubenswrapper[4892]: E0217 17:45:31.359697 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.428677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.428732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.428751 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.428775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.428792 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.531804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.531850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.531876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.531889 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.531898 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.560730 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 19:55:36.232867164 +0000 UTC Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.634309 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.634371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.634401 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.634442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.634468 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.737344 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.737425 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.737448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.737478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.737497 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.840554 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.840594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.840606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.840624 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.840634 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.943099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.943145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.943155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.943172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:31 crc kubenswrapper[4892]: I0217 17:45:31.943192 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:31Z","lastTransitionTime":"2026-02-17T17:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.047974 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.048036 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.048054 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.048081 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.048098 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.154971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.155213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.155230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.155254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.155271 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.258245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.258302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.258323 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.258354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.258373 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.361521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.361565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.361581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.361601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.361620 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.465524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.465576 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.465588 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.465607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.465623 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.561553 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 05:43:27.609097563 +0000 UTC Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.568618 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.568663 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.568675 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.568692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.568704 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.671845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.671888 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.671901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.671920 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.671935 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.775002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.775071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.775088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.775113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.775131 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.878148 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.878215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.878234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.878257 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.878276 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.981401 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.981462 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.981478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.981501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:32 crc kubenswrapper[4892]: I0217 17:45:32.981517 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:32Z","lastTransitionTime":"2026-02-17T17:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.083863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.083897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.083907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.083921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.083933 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.186431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.186470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.186479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.186496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.186507 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.289422 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.289475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.289484 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.289499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.289509 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.359230 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.359285 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.359324 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.359363 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:33 crc kubenswrapper[4892]: E0217 17:45:33.359518 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:33 crc kubenswrapper[4892]: E0217 17:45:33.359546 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:33 crc kubenswrapper[4892]: E0217 17:45:33.359677 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:33 crc kubenswrapper[4892]: E0217 17:45:33.359740 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.391763 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.391843 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.391861 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.391886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.391903 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.494329 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.494379 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.494395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.494416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.494432 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.562214 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:45:15.714027522 +0000 UTC Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.598120 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.598187 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.598211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.598238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.598259 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.701161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.701193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.701201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.701213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.701222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.803896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.803951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.803972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.803999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.804020 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.906067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.906156 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.906179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.906210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:33 crc kubenswrapper[4892]: I0217 17:45:33.906230 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:33Z","lastTransitionTime":"2026-02-17T17:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.009104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.009159 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.009180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.009204 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.009222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.111842 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.111912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.111935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.111961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.111980 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.215056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.215117 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.215126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.215139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.215149 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.317514 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.317565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.317582 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.317602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.317618 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.420938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.421002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.421019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.421044 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.421062 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.523685 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.523748 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.523769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.523797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.523864 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.563352 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:48:25.395216567 +0000 UTC Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.627003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.627065 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.627087 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.627116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.627142 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.730076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.730131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.730147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.730187 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.730209 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.833010 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.833060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.833076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.833100 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.833117 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.936496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.936627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.936654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.936684 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:34 crc kubenswrapper[4892]: I0217 17:45:34.936710 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:34Z","lastTransitionTime":"2026-02-17T17:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.039394 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.039469 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.039503 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.039535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.039554 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.142474 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.142636 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.142655 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.142677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.142693 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.245466 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.245515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.245526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.245542 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.245554 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.349768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.349855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.349877 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.349907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.349929 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.359104 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:35 crc kubenswrapper[4892]: E0217 17:45:35.359261 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.359512 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:35 crc kubenswrapper[4892]: E0217 17:45:35.359611 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.360161 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.360239 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:35 crc kubenswrapper[4892]: E0217 17:45:35.360356 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:35 crc kubenswrapper[4892]: E0217 17:45:35.360498 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.452354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.452402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.452414 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.452433 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.452446 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.554965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.555031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.555049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.555072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.555090 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.564263 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:42:10.409515719 +0000 UTC Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.657493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.657552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.657571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.657595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.657614 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.760176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.760250 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.760274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.760305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.760327 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.862404 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.862447 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.862458 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.862473 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.862483 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.964542 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.964597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.964613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.964640 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:35 crc kubenswrapper[4892]: I0217 17:45:35.964656 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:35Z","lastTransitionTime":"2026-02-17T17:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.067364 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.067421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.067435 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.067454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.067468 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.170969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.171029 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.171051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.171082 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.171104 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.274185 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.274253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.274276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.274303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.274325 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.376677 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.376741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.376759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.376780 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.376802 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.479487 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.479523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.479531 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.479546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.479554 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.564665 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 03:35:08.960636414 +0000 UTC Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.582034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.582107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.582128 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.582158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.582185 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.685012 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.685158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.685182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.685213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.685235 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.788977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.789067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.789089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.789115 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.789133 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.891722 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.891793 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.891805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.891868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.891884 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.994746 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.994865 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.994884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.994915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:36 crc kubenswrapper[4892]: I0217 17:45:36.994937 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:36Z","lastTransitionTime":"2026-02-17T17:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.098617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.098690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.098708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.098734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.098753 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.201693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.201757 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.201775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.201800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.201844 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.305538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.305588 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.305598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.305616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.305626 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.358840 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.358926 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:37 crc kubenswrapper[4892]: E0217 17:45:37.358999 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.359027 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:37 crc kubenswrapper[4892]: E0217 17:45:37.359081 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.359159 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:37 crc kubenswrapper[4892]: E0217 17:45:37.359323 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:37 crc kubenswrapper[4892]: E0217 17:45:37.359437 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.408052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.408096 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.408116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.408144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.408166 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.510652 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.510718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.510741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.510768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.510789 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.565634 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:52:35.733243596 +0000 UTC Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.613040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.613129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.613147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.613172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.613191 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.715744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.715867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.715899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.715926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.715944 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.818293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.818351 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.818369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.818395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.818412 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.920745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.920845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.920872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.920902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:37 crc kubenswrapper[4892]: I0217 17:45:37.920922 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:37Z","lastTransitionTime":"2026-02-17T17:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.023230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.023283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.023299 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.023322 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.023338 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:38Z","lastTransitionTime":"2026-02-17T17:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.117642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.117681 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.117691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.117706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.117718 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:38Z","lastTransitionTime":"2026-02-17T17:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.143679 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.143714 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.143723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.143737 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.143747 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T17:45:38Z","lastTransitionTime":"2026-02-17T17:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.182658 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk"] Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.183531 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.186522 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.186582 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.187209 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.187661 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.232754 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.232734173 podStartE2EDuration="12.232734173s" podCreationTimestamp="2026-02-17 17:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.232188357 +0000 UTC m=+109.607591662" watchObservedRunningTime="2026-02-17 17:45:38.232734173 +0000 UTC m=+109.608137458" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.312616 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podStartSLOduration=86.312592513 podStartE2EDuration="1m26.312592513s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.297897497 +0000 UTC m=+109.673300802" watchObservedRunningTime="2026-02-17 17:45:38.312592513 +0000 UTC m=+109.687995808" 
Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.317549 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88c19eb5-c83e-4667-85d1-1e534a5e21c9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.317604 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88c19eb5-c83e-4667-85d1-1e534a5e21c9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.317641 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88c19eb5-c83e-4667-85d1-1e534a5e21c9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.317675 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88c19eb5-c83e-4667-85d1-1e534a5e21c9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.317937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/88c19eb5-c83e-4667-85d1-1e534a5e21c9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.382462 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p6jtp" podStartSLOduration=86.382443891 podStartE2EDuration="1m26.382443891s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.35391636 +0000 UTC m=+109.729319665" watchObservedRunningTime="2026-02-17 17:45:38.382443891 +0000 UTC m=+109.757847166" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.410876 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=39.410861388 podStartE2EDuration="39.410861388s" podCreationTimestamp="2026-02-17 17:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.410704114 +0000 UTC m=+109.786107419" watchObservedRunningTime="2026-02-17 17:45:38.410861388 +0000 UTC m=+109.786264663" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.418570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c19eb5-c83e-4667-85d1-1e534a5e21c9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.418665 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/88c19eb5-c83e-4667-85d1-1e534a5e21c9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.418702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88c19eb5-c83e-4667-85d1-1e534a5e21c9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.418740 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88c19eb5-c83e-4667-85d1-1e534a5e21c9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.418773 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88c19eb5-c83e-4667-85d1-1e534a5e21c9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.418860 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88c19eb5-c83e-4667-85d1-1e534a5e21c9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.419339 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88c19eb5-c83e-4667-85d1-1e534a5e21c9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.420439 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88c19eb5-c83e-4667-85d1-1e534a5e21c9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.426345 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c19eb5-c83e-4667-85d1-1e534a5e21c9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.436310 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.436287837 podStartE2EDuration="1m30.436287837s" podCreationTimestamp="2026-02-17 17:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.436212965 +0000 UTC m=+109.811616230" watchObservedRunningTime="2026-02-17 17:45:38.436287837 +0000 UTC m=+109.811691142" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.441460 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88c19eb5-c83e-4667-85d1-1e534a5e21c9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6r2tk\" (UID: \"88c19eb5-c83e-4667-85d1-1e534a5e21c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.456152 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.456129909 podStartE2EDuration="56.456129909s" podCreationTimestamp="2026-02-17 17:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.454764683 +0000 UTC m=+109.830167948" watchObservedRunningTime="2026-02-17 17:45:38.456129909 +0000 UTC m=+109.831533214" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.493175 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5v456" podStartSLOduration=86.493147423 podStartE2EDuration="1m26.493147423s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.486157069 +0000 UTC m=+109.861560374" watchObservedRunningTime="2026-02-17 17:45:38.493147423 +0000 UTC m=+109.868550698" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.501530 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.511719 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k28d" podStartSLOduration=86.511688761 podStartE2EDuration="1m26.511688761s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.507967313 +0000 UTC m=+109.883370658" watchObservedRunningTime="2026-02-17 17:45:38.511688761 +0000 UTC m=+109.887092056" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.551446 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=90.551422786 podStartE2EDuration="1m30.551422786s" podCreationTimestamp="2026-02-17 17:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.530470155 +0000 UTC m=+109.905873440" watchObservedRunningTime="2026-02-17 17:45:38.551422786 +0000 UTC m=+109.926826091" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.566501 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 01:04:37.573376893 +0000 UTC Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.566601 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.576353 4892 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.583417 4892 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-multus/multus-lxpxh" podStartSLOduration=86.583388627 podStartE2EDuration="1m26.583388627s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.582672519 +0000 UTC m=+109.958075834" watchObservedRunningTime="2026-02-17 17:45:38.583388627 +0000 UTC m=+109.958791942" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.600083 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4dsxq" podStartSLOduration=86.600066996 podStartE2EDuration="1m26.600066996s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.598887924 +0000 UTC m=+109.974291189" watchObservedRunningTime="2026-02-17 17:45:38.600066996 +0000 UTC m=+109.975470261" Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.684524 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" event={"ID":"88c19eb5-c83e-4667-85d1-1e534a5e21c9","Type":"ContainerStarted","Data":"c5e8611bbd45f96f20010271d4a948db43de3a255ae3786d687022d7b8c1046a"} Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.684583 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" event={"ID":"88c19eb5-c83e-4667-85d1-1e534a5e21c9","Type":"ContainerStarted","Data":"15070364476d711c9b2e7ea439dbaef807f3bf44c19d3153a19818fa25b37e98"} Feb 17 17:45:38 crc kubenswrapper[4892]: I0217 17:45:38.697165 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6r2tk" podStartSLOduration=86.697148499 podStartE2EDuration="1m26.697148499s" 
podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:38.697115908 +0000 UTC m=+110.072519173" watchObservedRunningTime="2026-02-17 17:45:38.697148499 +0000 UTC m=+110.072551764" Feb 17 17:45:39 crc kubenswrapper[4892]: I0217 17:45:39.358878 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:39 crc kubenswrapper[4892]: I0217 17:45:39.358954 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:39 crc kubenswrapper[4892]: I0217 17:45:39.358992 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:39 crc kubenswrapper[4892]: I0217 17:45:39.359096 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:39 crc kubenswrapper[4892]: E0217 17:45:39.360719 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:39 crc kubenswrapper[4892]: E0217 17:45:39.360889 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:39 crc kubenswrapper[4892]: E0217 17:45:39.361032 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:39 crc kubenswrapper[4892]: E0217 17:45:39.361198 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:41 crc kubenswrapper[4892]: I0217 17:45:41.359405 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:41 crc kubenswrapper[4892]: I0217 17:45:41.359573 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:41 crc kubenswrapper[4892]: I0217 17:45:41.359433 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:41 crc kubenswrapper[4892]: I0217 17:45:41.359721 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:41 crc kubenswrapper[4892]: E0217 17:45:41.359714 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:41 crc kubenswrapper[4892]: E0217 17:45:41.359942 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:41 crc kubenswrapper[4892]: E0217 17:45:41.360071 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:41 crc kubenswrapper[4892]: E0217 17:45:41.360258 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:43 crc kubenswrapper[4892]: I0217 17:45:43.358957 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:43 crc kubenswrapper[4892]: I0217 17:45:43.359124 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:43 crc kubenswrapper[4892]: I0217 17:45:43.359329 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:43 crc kubenswrapper[4892]: E0217 17:45:43.359569 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:43 crc kubenswrapper[4892]: I0217 17:45:43.359782 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:43 crc kubenswrapper[4892]: E0217 17:45:43.359965 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:43 crc kubenswrapper[4892]: E0217 17:45:43.360054 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:43 crc kubenswrapper[4892]: E0217 17:45:43.360141 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:44 crc kubenswrapper[4892]: I0217 17:45:44.359622 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:45:44 crc kubenswrapper[4892]: E0217 17:45:44.359955 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kp5h9_openshift-ovn-kubernetes(b23058a0-04ec-4a23-82cb-60f9b368eaa0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.358769 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.358928 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.358770 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:45 crc kubenswrapper[4892]: E0217 17:45:45.359016 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:45 crc kubenswrapper[4892]: E0217 17:45:45.359142 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.359169 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:45 crc kubenswrapper[4892]: E0217 17:45:45.359298 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:45 crc kubenswrapper[4892]: E0217 17:45:45.359474 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.707908 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/1.log" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.708506 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/0.log" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.708581 4892 generic.go:334] "Generic (PLEG): container finished" podID="43b12f44-0079-4031-9b1d-492c374250df" containerID="6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c" exitCode=1 Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.708625 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerDied","Data":"6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c"} Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.708674 4892 scope.go:117] "RemoveContainer" containerID="5490d72f776fd4e96f56e933c182c8589e96d80806378f96d97c4d0f6dccd8eb" Feb 17 17:45:45 crc kubenswrapper[4892]: I0217 17:45:45.709373 4892 scope.go:117] "RemoveContainer" containerID="6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c" Feb 17 17:45:45 crc kubenswrapper[4892]: 
E0217 17:45:45.709896 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lxpxh_openshift-multus(43b12f44-0079-4031-9b1d-492c374250df)\"" pod="openshift-multus/multus-lxpxh" podUID="43b12f44-0079-4031-9b1d-492c374250df" Feb 17 17:45:46 crc kubenswrapper[4892]: I0217 17:45:46.713872 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/1.log" Feb 17 17:45:47 crc kubenswrapper[4892]: I0217 17:45:47.359588 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:47 crc kubenswrapper[4892]: I0217 17:45:47.359699 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:47 crc kubenswrapper[4892]: I0217 17:45:47.359780 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:47 crc kubenswrapper[4892]: E0217 17:45:47.359994 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:47 crc kubenswrapper[4892]: I0217 17:45:47.360052 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:47 crc kubenswrapper[4892]: E0217 17:45:47.360183 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:47 crc kubenswrapper[4892]: E0217 17:45:47.360318 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:47 crc kubenswrapper[4892]: E0217 17:45:47.360465 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:49 crc kubenswrapper[4892]: I0217 17:45:49.358993 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:49 crc kubenswrapper[4892]: E0217 17:45:49.361255 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:49 crc kubenswrapper[4892]: I0217 17:45:49.361327 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:49 crc kubenswrapper[4892]: I0217 17:45:49.361355 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:49 crc kubenswrapper[4892]: I0217 17:45:49.361386 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:49 crc kubenswrapper[4892]: E0217 17:45:49.362089 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:49 crc kubenswrapper[4892]: E0217 17:45:49.362217 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:49 crc kubenswrapper[4892]: E0217 17:45:49.362369 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:49 crc kubenswrapper[4892]: E0217 17:45:49.382870 4892 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 17:45:49 crc kubenswrapper[4892]: E0217 17:45:49.471481 4892 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 17:45:51 crc kubenswrapper[4892]: I0217 17:45:51.359613 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:51 crc kubenswrapper[4892]: I0217 17:45:51.359746 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:51 crc kubenswrapper[4892]: I0217 17:45:51.359800 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:51 crc kubenswrapper[4892]: I0217 17:45:51.359878 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:51 crc kubenswrapper[4892]: E0217 17:45:51.360034 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:51 crc kubenswrapper[4892]: E0217 17:45:51.360334 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:51 crc kubenswrapper[4892]: E0217 17:45:51.360429 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:51 crc kubenswrapper[4892]: E0217 17:45:51.360483 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:53 crc kubenswrapper[4892]: I0217 17:45:53.359420 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:53 crc kubenswrapper[4892]: I0217 17:45:53.359463 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:53 crc kubenswrapper[4892]: I0217 17:45:53.359538 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:53 crc kubenswrapper[4892]: I0217 17:45:53.359634 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:53 crc kubenswrapper[4892]: E0217 17:45:53.359629 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:53 crc kubenswrapper[4892]: E0217 17:45:53.359775 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:53 crc kubenswrapper[4892]: E0217 17:45:53.359915 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:53 crc kubenswrapper[4892]: E0217 17:45:53.360047 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:54 crc kubenswrapper[4892]: E0217 17:45:54.473081 4892 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 17:45:55 crc kubenswrapper[4892]: I0217 17:45:55.358679 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:55 crc kubenswrapper[4892]: I0217 17:45:55.358751 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:55 crc kubenswrapper[4892]: E0217 17:45:55.358927 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:55 crc kubenswrapper[4892]: I0217 17:45:55.359000 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:55 crc kubenswrapper[4892]: I0217 17:45:55.359050 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:55 crc kubenswrapper[4892]: E0217 17:45:55.359194 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:55 crc kubenswrapper[4892]: E0217 17:45:55.359325 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:55 crc kubenswrapper[4892]: E0217 17:45:55.359449 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:57 crc kubenswrapper[4892]: I0217 17:45:57.359045 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:57 crc kubenswrapper[4892]: I0217 17:45:57.359117 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:57 crc kubenswrapper[4892]: E0217 17:45:57.359493 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:57 crc kubenswrapper[4892]: I0217 17:45:57.359311 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:57 crc kubenswrapper[4892]: I0217 17:45:57.359179 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:57 crc kubenswrapper[4892]: E0217 17:45:57.359666 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:57 crc kubenswrapper[4892]: E0217 17:45:57.359896 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:57 crc kubenswrapper[4892]: E0217 17:45:57.359979 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.359009 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.359113 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:45:59 crc kubenswrapper[4892]: E0217 17:45:59.360074 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.360103 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.360086 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:45:59 crc kubenswrapper[4892]: E0217 17:45:59.360183 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.360696 4892 scope.go:117] "RemoveContainer" containerID="6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c" Feb 17 17:45:59 crc kubenswrapper[4892]: E0217 17:45:59.360734 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:45:59 crc kubenswrapper[4892]: E0217 17:45:59.360806 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.361069 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:45:59 crc kubenswrapper[4892]: E0217 17:45:59.474692 4892 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.758675 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/3.log" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.762177 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerStarted","Data":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.763503 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.765608 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/1.log" Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.765665 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerStarted","Data":"304725a2b080e31a96b7983f19b1eedea834a9b2e00608dc922e78f46533ae2a"} Feb 17 17:45:59 crc kubenswrapper[4892]: I0217 17:45:59.791211 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podStartSLOduration=107.791193017 podStartE2EDuration="1m47.791193017s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:59.790088158 +0000 UTC m=+131.165491423" watchObservedRunningTime="2026-02-17 17:45:59.791193017 +0000 UTC m=+131.166596282" Feb 17 17:46:00 crc kubenswrapper[4892]: I0217 17:46:00.292083 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-2q4n6"] Feb 17 17:46:00 crc kubenswrapper[4892]: I0217 17:46:00.292201 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:00 crc kubenswrapper[4892]: E0217 17:46:00.292282 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:46:01 crc kubenswrapper[4892]: I0217 17:46:01.358486 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:01 crc kubenswrapper[4892]: I0217 17:46:01.358531 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:46:01 crc kubenswrapper[4892]: E0217 17:46:01.358972 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:46:01 crc kubenswrapper[4892]: I0217 17:46:01.358582 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:01 crc kubenswrapper[4892]: E0217 17:46:01.359032 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:46:01 crc kubenswrapper[4892]: E0217 17:46:01.359176 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:46:02 crc kubenswrapper[4892]: I0217 17:46:02.358849 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:02 crc kubenswrapper[4892]: E0217 17:46:02.359048 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:46:03 crc kubenswrapper[4892]: I0217 17:46:03.359489 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:03 crc kubenswrapper[4892]: E0217 17:46:03.359619 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 17:46:03 crc kubenswrapper[4892]: I0217 17:46:03.359493 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:46:03 crc kubenswrapper[4892]: E0217 17:46:03.359705 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 17:46:03 crc kubenswrapper[4892]: I0217 17:46:03.359493 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:03 crc kubenswrapper[4892]: E0217 17:46:03.359773 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 17:46:04 crc kubenswrapper[4892]: I0217 17:46:04.358627 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:04 crc kubenswrapper[4892]: E0217 17:46:04.358983 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q4n6" podUID="9290105c-74a4-487a-879f-3f79186b3b01" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.359148 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.360368 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.361165 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.366761 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.368173 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.368637 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 17:46:05 crc kubenswrapper[4892]: I0217 17:46:05.369050 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 17:46:06 crc kubenswrapper[4892]: I0217 17:46:06.359027 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:06 crc kubenswrapper[4892]: I0217 17:46:06.361797 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 17:46:06 crc kubenswrapper[4892]: I0217 17:46:06.362094 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.567644 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.615093 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5flc"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.615977 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.617635 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zh54l"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.618163 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.630623 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.630874 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631013 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631189 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631026 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631313 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631487 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631666 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631781 4892 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.631964 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.632158 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.632292 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.633608 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.633930 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.638895 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.639938 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657052 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jkg\" (UniqueName: \"kubernetes.io/projected/be8497ae-6021-4f47-9bf1-64fc30d9e161-kube-api-access-w2jkg\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657121 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-etcd-client\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657156 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657194 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be8497ae-6021-4f47-9bf1-64fc30d9e161-config\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657226 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-audit\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657257 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac493e83-90b2-4dc6-a414-256a21927213-audit-dir\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657286 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nxs95\" (UniqueName: \"kubernetes.io/projected/ac493e83-90b2-4dc6-a414-256a21927213-kube-api-access-nxs95\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657339 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-config\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657371 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/be8497ae-6021-4f47-9bf1-64fc30d9e161-images\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657399 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-serving-cert\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657428 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac493e83-90b2-4dc6-a414-256a21927213-node-pullsecrets\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657472 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657512 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/be8497ae-6021-4f47-9bf1-64fc30d9e161-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657818 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-st4m9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.658364 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.657560 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-image-import-ca\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.658527 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-encryption-config\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" 
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.659050 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.659074 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.662904 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nj852"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.663574 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lzbcm"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.664036 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.664474 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.665067 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.665405 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.665557 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.665643 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.665781 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.665982 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.666072 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.670310 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.670491 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.670956 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 
17:46:08.672193 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.673080 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.673644 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.674112 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hzxbm"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.674483 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.677092 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zc25m"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.677489 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.685282 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.685323 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.685363 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.685576 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.685872 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.685921 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686006 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686117 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686171 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686192 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686227 4892 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686310 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686333 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.686440 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.687766 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.690494 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.690773 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.690797 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.691127 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.691304 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.691437 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 17:46:08 crc 
kubenswrapper[4892]: I0217 17:46:08.691466 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.691679 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.691824 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.692386 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.692588 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.692734 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.692903 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.693027 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.693124 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.693275 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.697926 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t24t9"] Feb 17 17:46:08 
crc kubenswrapper[4892]: I0217 17:46:08.698619 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.699210 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.699445 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.699967 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.701535 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.705341 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.713044 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4q9d4"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.714767 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.715890 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.717028 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.717555 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.718074 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.718566 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.719991 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.720293 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.720458 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.720672 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.725922 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 
17:46:08.754242 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.754415 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.755014 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.755482 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.755634 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n8n76"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.755488 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.756164 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.756255 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.756407 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.756720 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.757275 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.761371 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.761596 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.761757 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.761943 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.762100 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.762263 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.762488 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.762646 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 
17:46:08.762753 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.762933 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.762978 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.763074 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.763169 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.763532 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.763867 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.763895 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764402 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764526 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-config\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764553 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/be8497ae-6021-4f47-9bf1-64fc30d9e161-images\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764574 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baae3f3-ceff-445d-9cf0-82284224bff7-serving-cert\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764593 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-serving-cert\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764610 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65gw\" (UniqueName: 
\"kubernetes.io/projected/14853a40-bee2-4d8e-a148-5f0ae761d71c-kube-api-access-c65gw\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764627 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5qpz\" (UniqueName: \"kubernetes.io/projected/183505a9-eeb4-4bf8-94ae-2c593e78b926-kube-api-access-j5qpz\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac493e83-90b2-4dc6-a414-256a21927213-node-pullsecrets\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764662 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/be8497ae-6021-4f47-9bf1-64fc30d9e161-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 
17:46:08.764693 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-serving-cert\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-trusted-ca-bundle\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764725 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kms92\" (UniqueName: \"kubernetes.io/projected/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-kube-api-access-kms92\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764774 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac493e83-90b2-4dc6-a414-256a21927213-node-pullsecrets\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764923 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-image-import-ca\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " 
pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-serving-cert\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764961 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.764983 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-client-ca\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765007 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-client-ca\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765021 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-config\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765036 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765054 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-encryption-config\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765146 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-config\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765207 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: 
I0217 17:46:08.765229 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-serving-cert\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765266 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-audit-policies\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765311 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5plsk\" (UniqueName: \"kubernetes.io/projected/4baae3f3-ceff-445d-9cf0-82284224bff7-kube-api-access-5plsk\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765333 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/129ae682-89fa-4fa9-b4de-789ea8c0a9fd-metrics-tls\") pod \"dns-operator-744455d44c-hzxbm\" (UID: \"129ae682-89fa-4fa9-b4de-789ea8c0a9fd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765375 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfcts\" (UniqueName: 
\"kubernetes.io/projected/6c5aba9a-b953-479b-a16a-93af181b7445-kube-api-access-mfcts\") pod \"cluster-samples-operator-665b6dd947-hwwms\" (UID: \"6c5aba9a-b953-479b-a16a-93af181b7445\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183505a9-eeb4-4bf8-94ae-2c593e78b926-serving-cert\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-service-ca-bundle\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765477 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/be8497ae-6021-4f47-9bf1-64fc30d9e161-images\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765501 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jkg\" (UniqueName: \"kubernetes.io/projected/be8497ae-6021-4f47-9bf1-64fc30d9e161-kube-api-access-w2jkg\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 
17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765543 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765559 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96s4p\" (UniqueName: \"kubernetes.io/projected/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-kube-api-access-96s4p\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765578 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-config\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765611 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-service-ca\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765628 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-etcd-client\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: 
\"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765647 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-etcd-client\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765697 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-encryption-config\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765716 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-config\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765733 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l8dj\" (UniqueName: 
\"kubernetes.io/projected/129ae682-89fa-4fa9-b4de-789ea8c0a9fd-kube-api-access-2l8dj\") pod \"dns-operator-744455d44c-hzxbm\" (UID: \"129ae682-89fa-4fa9-b4de-789ea8c0a9fd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765767 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be8497ae-6021-4f47-9bf1-64fc30d9e161-config\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765786 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/14853a40-bee2-4d8e-a148-5f0ae761d71c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765802 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-oauth-serving-cert\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-audit\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765862 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9snm\" (UniqueName: \"kubernetes.io/projected/0b0cbdbb-671e-41e1-b494-a369938dab8e-kube-api-access-r9snm\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765880 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-serving-cert\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-trusted-ca\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765592 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-72rk9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.766014 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac493e83-90b2-4dc6-a414-256a21927213-audit-dir\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.766118 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-image-import-ca\") pod 
\"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.766225 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.766942 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be8497ae-6021-4f47-9bf1-64fc30d9e161-config\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767080 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.765946 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac493e83-90b2-4dc6-a414-256a21927213-audit-dir\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-oauth-config\") pod 
\"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-audit-dir\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxs95\" (UniqueName: \"kubernetes.io/projected/ac493e83-90b2-4dc6-a414-256a21927213-kube-api-access-nxs95\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767404 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjkmd\" (UniqueName: \"kubernetes.io/projected/843d01a5-8de5-4628-99d0-2ac552e9abf5-kube-api-access-tjkmd\") pod \"downloads-7954f5f757-4q9d4\" (UID: \"843d01a5-8de5-4628-99d0-2ac552e9abf5\") " pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767424 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47s6x\" (UniqueName: \"kubernetes.io/projected/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-kube-api-access-47s6x\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767455 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-config\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767471 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14853a40-bee2-4d8e-a148-5f0ae761d71c-serving-cert\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767489 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c5aba9a-b953-479b-a16a-93af181b7445-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hwwms\" (UID: \"6c5aba9a-b953-479b-a16a-93af181b7445\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767924 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767954 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767972 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.767954 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768138 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-config\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768173 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768304 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768373 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768499 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768556 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.768958 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.769106 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac493e83-90b2-4dc6-a414-256a21927213-audit\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.769838 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.770835 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.771448 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.772280 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.773358 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-serving-cert\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.773567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/be8497ae-6021-4f47-9bf1-64fc30d9e161-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: 
\"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.773648 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.775087 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.775246 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.779154 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cnjbv"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.779478 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.779983 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ddw22"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.780141 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.780388 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.780859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-etcd-client\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.781410 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.784138 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.786161 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.791367 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac493e83-90b2-4dc6-a414-256a21927213-encryption-config\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.793665 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.798870 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.802535 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5"] Feb 17 
17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.804093 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.804169 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.807169 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.808373 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.811255 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5flc"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.818167 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.829408 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.829991 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.830172 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.830374 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.830569 
4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.830712 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.830965 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.832187 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v4nm6"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.832911 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.834896 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xlhd5"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.835510 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xcgd8"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.835955 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.836138 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.836428 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.836737 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.836874 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.837374 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.837756 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.838270 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.838417 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.840587 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.842176 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6wqsp"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.842332 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.843136 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.843407 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.843902 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.844037 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.845197 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.846777 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lzbcm"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.849339 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hzxbm"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.850420 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-st4m9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.851942 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zh54l"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.854185 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n8n76"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.856111 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.858190 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.858320 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.859703 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.860924 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-zc25m"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.863037 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.866195 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xlhd5"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.867451 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.868042 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjkmd\" (UniqueName: \"kubernetes.io/projected/843d01a5-8de5-4628-99d0-2ac552e9abf5-kube-api-access-tjkmd\") pod \"downloads-7954f5f757-4q9d4\" (UID: \"843d01a5-8de5-4628-99d0-2ac552e9abf5\") " pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.868249 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.868274 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 
17:46:08.868347 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47s6x\" (UniqueName: \"kubernetes.io/projected/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-kube-api-access-47s6x\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.868790 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.868907 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fjj\" (UniqueName: \"kubernetes.io/projected/b571ef16-161d-4280-aaa0-3f15b76dcc21-kube-api-access-d6fjj\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869060 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72a2852b-7dd5-4681-b330-b8e4128d75b0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869085 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: 
I0217 17:46:08.869143 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36e96f72-2b42-4b98-b6f1-5fe676510aed-proxy-tls\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869158 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72a2852b-7dd5-4681-b330-b8e4128d75b0-config\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869224 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vbvj\" (UniqueName: \"kubernetes.io/projected/58da6748-a277-48e6-a169-6e6477486e44-kube-api-access-5vbvj\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869241 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14853a40-bee2-4d8e-a148-5f0ae761d71c-serving-cert\") 
pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869300 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c5aba9a-b953-479b-a16a-93af181b7445-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hwwms\" (UID: \"6c5aba9a-b953-479b-a16a-93af181b7445\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.869318 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b571ef16-161d-4280-aaa0-3f15b76dcc21-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870241 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-config\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870285 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65gw\" (UniqueName: \"kubernetes.io/projected/14853a40-bee2-4d8e-a148-5f0ae761d71c-kube-api-access-c65gw\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baae3f3-ceff-445d-9cf0-82284224bff7-serving-cert\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870317 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sjd\" (UniqueName: \"kubernetes.io/projected/e7188959-438f-469b-8c0e-8e2af8c54d3f-kube-api-access-47sjd\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870333 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5qpz\" (UniqueName: 
\"kubernetes.io/projected/183505a9-eeb4-4bf8-94ae-2c593e78b926-kube-api-access-j5qpz\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870370 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-serving-cert\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-trusted-ca-bundle\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870401 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kms92\" (UniqueName: \"kubernetes.io/projected/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-kube-api-access-kms92\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870417 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50dcb3e4-a35c-4e36-872c-32a252f542a7-config\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:08 crc kubenswrapper[4892]: 
I0217 17:46:08.870433 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-serving-cert\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870474 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870490 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36e96f72-2b42-4b98-b6f1-5fe676510aed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870515 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-client-ca\") pod 
\"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870531 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-config\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870546 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-client-ca\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870582 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-config\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870620 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e1952c19-97f1-41ae-b9de-834d2325c943-profile-collector-cert\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870642 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870660 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-serving-cert\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-audit-policies\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870693 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clnks\" (UniqueName: \"kubernetes.io/projected/36e96f72-2b42-4b98-b6f1-5fe676510aed-kube-api-access-clnks\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870710 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870736 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5plsk\" (UniqueName: \"kubernetes.io/projected/4baae3f3-ceff-445d-9cf0-82284224bff7-kube-api-access-5plsk\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870752 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/129ae682-89fa-4fa9-b4de-789ea8c0a9fd-metrics-tls\") pod \"dns-operator-744455d44c-hzxbm\" (UID: \"129ae682-89fa-4fa9-b4de-789ea8c0a9fd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870771 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377a3b2-2464-4568-a5d9-0557bc175309-config\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870788 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfcts\" (UniqueName: \"kubernetes.io/projected/6c5aba9a-b953-479b-a16a-93af181b7445-kube-api-access-mfcts\") pod \"cluster-samples-operator-665b6dd947-hwwms\" (UID: \"6c5aba9a-b953-479b-a16a-93af181b7445\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-config\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870805 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870836 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183505a9-eeb4-4bf8-94ae-2c593e78b926-serving-cert\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870853 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-service-ca-bundle\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870870 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870886 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377a3b2-2464-4568-a5d9-0557bc175309-serving-cert\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870900 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870923 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nj852"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdps\" (UniqueName: \"kubernetes.io/projected/0168b546-66c4-4663-a840-4b4b1968d725-kube-api-access-5jdps\") pod \"migrator-59844c95c7-qw5r5\" (UID: \"0168b546-66c4-4663-a840-4b4b1968d725\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.870986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58da6748-a277-48e6-a169-6e6477486e44-audit-dir\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871007 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smgtz\" (UniqueName: \"kubernetes.io/projected/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-kube-api-access-smgtz\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871023 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-audit-policies\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871040 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e1952c19-97f1-41ae-b9de-834d2325c943-srv-cert\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871057 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwgr5\" (UniqueName: \"kubernetes.io/projected/f377a3b2-2464-4568-a5d9-0557bc175309-kube-api-access-bwgr5\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871074 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871091 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871111 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871133 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96s4p\" (UniqueName: \"kubernetes.io/projected/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-kube-api-access-96s4p\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e7188959-438f-469b-8c0e-8e2af8c54d3f-tmpfs\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871182 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn68r\" (UniqueName: \"kubernetes.io/projected/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-kube-api-access-xn68r\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871206 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871228 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-config\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871247 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-service-ca\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871265 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-etcd-client\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871285 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72a2852b-7dd5-4681-b330-b8e4128d75b0-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-encryption-config\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871325 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-config\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871343 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-webhook-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871364 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871387 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/14853a40-bee2-4d8e-a148-5f0ae761d71c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-oauth-serving-cert\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871423 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l8dj\" (UniqueName: \"kubernetes.io/projected/129ae682-89fa-4fa9-b4de-789ea8c0a9fd-kube-api-access-2l8dj\") pod \"dns-operator-744455d44c-hzxbm\" (UID: \"129ae682-89fa-4fa9-b4de-789ea8c0a9fd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871440 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrnv6\" (UniqueName: \"kubernetes.io/projected/e1952c19-97f1-41ae-b9de-834d2325c943-kube-api-access-hrnv6\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871460 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9snm\" (UniqueName: \"kubernetes.io/projected/0b0cbdbb-671e-41e1-b494-a369938dab8e-kube-api-access-r9snm\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " 
pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871477 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-serving-cert\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871496 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50dcb3e4-a35c-4e36-872c-32a252f542a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871514 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-trusted-ca\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871533 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50dcb3e4-a35c-4e36-872c-32a252f542a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871557 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-oauth-config\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871575 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-audit-dir\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.871642 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-audit-dir\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.872108 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.872113 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-trusted-ca-bundle\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.872825 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6c5aba9a-b953-479b-a16a-93af181b7445-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hwwms\" (UID: \"6c5aba9a-b953-479b-a16a-93af181b7445\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.873033 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-config\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.873186 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.873256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-audit-policies\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.873638 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-service-ca\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.874140 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.874235 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-oauth-serving-cert\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.874571 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/14853a40-bee2-4d8e-a148-5f0ae761d71c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.875820 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.877007 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14853a40-bee2-4d8e-a148-5f0ae761d71c-serving-cert\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.877051 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baae3f3-ceff-445d-9cf0-82284224bff7-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.877506 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-client-ca\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.877988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-service-ca-bundle\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.878202 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-config\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.879022 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.879177 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-encryption-config\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.879247 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.879278 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.879298 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-client-ca\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.878463 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-config\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.880249 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-trusted-ca\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.881370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-etcd-client\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.881426 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.881907 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.882435 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4q9d4"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.885121 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-serving-cert\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.885228 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-config\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.885505 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v4nm6"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.887016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-serving-cert\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.887086 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-serving-cert\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.887591 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xcgd8"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.888003 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/129ae682-89fa-4fa9-b4de-789ea8c0a9fd-metrics-tls\") pod \"dns-operator-744455d44c-hzxbm\" (UID: \"129ae682-89fa-4fa9-b4de-789ea8c0a9fd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.889722 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183505a9-eeb4-4bf8-94ae-2c593e78b926-serving-cert\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.889872 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-oauth-config\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:08 crc 
kubenswrapper[4892]: I0217 17:46:08.891614 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-22jlr"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.892811 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t24t9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.892917 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.893654 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-serving-cert\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.893713 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.894696 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.896072 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.897310 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.897832 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 
17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.898358 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.899391 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6wqsp"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.901456 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cnjbv"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.902544 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.903632 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.904753 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-72rk9"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.905824 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.906907 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.908273 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.909279 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nnn4l"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.910359 4892 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cv2jf"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.910519 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.911359 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cv2jf"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.911446 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.912533 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nnn4l"] Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.918452 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.938550 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.959198 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972115 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36e96f72-2b42-4b98-b6f1-5fe676510aed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972167 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e1952c19-97f1-41ae-b9de-834d2325c943-profile-collector-cert\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972221 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clnks\" (UniqueName: \"kubernetes.io/projected/36e96f72-2b42-4b98-b6f1-5fe676510aed-kube-api-access-clnks\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972267 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377a3b2-2464-4568-a5d9-0557bc175309-config\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972297 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972322 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972338 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377a3b2-2464-4568-a5d9-0557bc175309-serving-cert\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972377 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdps\" 
(UniqueName: \"kubernetes.io/projected/0168b546-66c4-4663-a840-4b4b1968d725-kube-api-access-5jdps\") pod \"migrator-59844c95c7-qw5r5\" (UID: \"0168b546-66c4-4663-a840-4b4b1968d725\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972392 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58da6748-a277-48e6-a169-6e6477486e44-audit-dir\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972425 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smgtz\" (UniqueName: \"kubernetes.io/projected/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-kube-api-access-smgtz\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972442 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-audit-policies\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972461 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e1952c19-97f1-41ae-b9de-834d2325c943-srv-cert\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972484 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwgr5\" (UniqueName: \"kubernetes.io/projected/f377a3b2-2464-4568-a5d9-0557bc175309-kube-api-access-bwgr5\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972508 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972579 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972631 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e7188959-438f-469b-8c0e-8e2af8c54d3f-tmpfs\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972649 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn68r\" (UniqueName: \"kubernetes.io/projected/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-kube-api-access-xn68r\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972667 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972694 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72a2852b-7dd5-4681-b330-b8e4128d75b0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972715 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-webhook-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" 
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972803 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrnv6\" (UniqueName: \"kubernetes.io/projected/e1952c19-97f1-41ae-b9de-834d2325c943-kube-api-access-hrnv6\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50dcb3e4-a35c-4e36-872c-32a252f542a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972875 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50dcb3e4-a35c-4e36-872c-32a252f542a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972915 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972936 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972960 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fjj\" (UniqueName: \"kubernetes.io/projected/b571ef16-161d-4280-aaa0-3f15b76dcc21-kube-api-access-d6fjj\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972982 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72a2852b-7dd5-4681-b330-b8e4128d75b0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973005 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973041 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36e96f72-2b42-4b98-b6f1-5fe676510aed-proxy-tls\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72a2852b-7dd5-4681-b330-b8e4128d75b0-config\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973089 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vbvj\" (UniqueName: \"kubernetes.io/projected/58da6748-a277-48e6-a169-6e6477486e44-kube-api-access-5vbvj\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973131 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/b571ef16-161d-4280-aaa0-3f15b76dcc21-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973148 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sjd\" (UniqueName: \"kubernetes.io/projected/e7188959-438f-469b-8c0e-8e2af8c54d3f-kube-api-access-47sjd\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973232 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50dcb3e4-a35c-4e36-872c-32a252f542a7-config\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.973257 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.974282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72a2852b-7dd5-4681-b330-b8e4128d75b0-config\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.974519 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58da6748-a277-48e6-a169-6e6477486e44-audit-dir\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.972959 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36e96f72-2b42-4b98-b6f1-5fe676510aed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.974741 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-audit-policies\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.975115 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.975151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e7188959-438f-469b-8c0e-8e2af8c54d3f-tmpfs\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.975370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.975913 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50dcb3e4-a35c-4e36-872c-32a252f542a7-config\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.976119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.976980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.977457 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.977756 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.978593 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.978770 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.979119 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.979168 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.979463 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.981098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50dcb3e4-a35c-4e36-872c-32a252f542a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.981631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72a2852b-7dd5-4681-b330-b8e4128d75b0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz"
Feb 17 17:46:08 crc kubenswrapper[4892]: I0217 17:46:08.983260 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.012979 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jkg\" (UniqueName: \"kubernetes.io/projected/be8497ae-6021-4f47-9bf1-64fc30d9e161-kube-api-access-w2jkg\") pod \"machine-api-operator-5694c8668f-zh54l\" (UID: \"be8497ae-6021-4f47-9bf1-64fc30d9e161\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.032594 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxs95\" (UniqueName: \"kubernetes.io/projected/ac493e83-90b2-4dc6-a414-256a21927213-kube-api-access-nxs95\") pod \"apiserver-76f77b778f-t5flc\" (UID: \"ac493e83-90b2-4dc6-a414-256a21927213\") " pod="openshift-apiserver/apiserver-76f77b778f-t5flc"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.057923 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.078172 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.098534 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.117795 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.138672 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.158374 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.178699 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.199403 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.218474 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.237989 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.254185 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t5flc"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.259023 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.264693 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.278948 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.298246 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.331756 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.336452 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.341526 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.351010 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.361017 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.378591 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.409896 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.418554 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.438357 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.458120 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.464032 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5flc"]
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.479965 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.498218 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.506600 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zh54l"]
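Runs of kubelet journal output like the above interleave `MountVolume.SetUp` outcomes for many pods, which makes failures easy to miss. A minimal triage sketch (the regex and sample lines are illustrative, not any official tooling) that tallies succeeded versus failed volume mounts per volume name:

```python
# Triage sketch for kubelet journal excerpts: tally MountVolume.SetUp outcomes.
# The pattern handles both plain and backslash-escaped quotes around volume
# names, as both forms appear in the log above. Sample lines are illustrative.
import re
from collections import Counter

MOUNT_RE = re.compile(r'MountVolume\.SetUp (succeeded|failed) for volume \\?"([^"\\]+)')

def mount_outcomes(lines):
    """Return (succeeded, failed) Counters keyed by volume name."""
    ok, failed = Counter(), Counter()
    for line in lines:
        m = MOUNT_RE.search(line)
        if m:
            (ok if m.group(1) == "succeeded" else failed)[m.group(2)] += 1
    return ok, failed

sample = [
    r'I0217 17:46:08.974282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\""',
    'E0217 17:46:09.974123 4892 nestedpendingoperations.go:348] Error: MountVolume.SetUp failed for volume "serving-cert"',
]
ok, failed = mount_outcomes(sample)
```

Running `mount_outcomes` over a full journal dump gives a quick view of which volumes are still retrying.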
Feb 17 17:46:09 crc kubenswrapper[4892]: W0217 17:46:09.514243 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8497ae_6021_4f47_9bf1_64fc30d9e161.slice/crio-f43207116875cb6883bb9da89951ab13ce16df8bf5e7368dbd1402408578e85b WatchSource:0}: Error finding container f43207116875cb6883bb9da89951ab13ce16df8bf5e7368dbd1402408578e85b: Status 404 returned error can't find the container with id f43207116875cb6883bb9da89951ab13ce16df8bf5e7368dbd1402408578e85b
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.520067 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.545113 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.558384 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.578222 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.598956 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.619029 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.639836 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.659163 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.677913 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.698739 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.718557 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.739353 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.759150 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.778420 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.798624 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.803429 4892 generic.go:334] "Generic (PLEG): container finished" podID="ac493e83-90b2-4dc6-a414-256a21927213" containerID="85759d4e8b558de59bd2f8376306e69c05a5428602a43ff0d5a9ecbfea8f3ef9" exitCode=0
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.803883 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" event={"ID":"ac493e83-90b2-4dc6-a414-256a21927213","Type":"ContainerDied","Data":"85759d4e8b558de59bd2f8376306e69c05a5428602a43ff0d5a9ecbfea8f3ef9"}
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.803916 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" event={"ID":"ac493e83-90b2-4dc6-a414-256a21927213","Type":"ContainerStarted","Data":"ef057c37f23e67190ecc60af243a981126f32b42111b6737861b960c4f6ef397"}
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.806338 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" event={"ID":"be8497ae-6021-4f47-9bf1-64fc30d9e161","Type":"ContainerStarted","Data":"ab8e2608e4c17c5494f7619eefb8affad016574833a04655865612c870a67932"}
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.806366 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" event={"ID":"be8497ae-6021-4f47-9bf1-64fc30d9e161","Type":"ContainerStarted","Data":"21d81cbaf802a511c69d8980bf368715ac4f62fbe2fc7f1733ac4eaa6cd8f8b6"}
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.806381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" event={"ID":"be8497ae-6021-4f47-9bf1-64fc30d9e161","Type":"ContainerStarted","Data":"f43207116875cb6883bb9da89951ab13ce16df8bf5e7368dbd1402408578e85b"}
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.818326 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.828049 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e1952c19-97f1-41ae-b9de-834d2325c943-profile-collector-cert\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.837363 4892 request.go:700] Waited for 1.006410491s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.839106 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.858201 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.868396 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e1952c19-97f1-41ae-b9de-834d2325c943-srv-cert\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.878940 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.898184 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.918728 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.938096 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.959022 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
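The `Waited for 1.006410491s due to client-side throttling, not priority and fairness` entry above is the kubelet's Kubernetes API client rate-limiting itself before the request ever leaves the node. A minimal token-bucket sketch of that behavior (the qps and burst values here are illustrative, not this kubelet's actual configuration):

```python
# Token-bucket sketch of client-side request throttling (assumed model of the
# behavior behind the "Waited for ... due to client-side throttling" log line;
# qps/burst values are illustrative).
import time

class TokenBucket:
    def __init__(self, qps: float, burst: int):
        self.rate = qps              # tokens replenished per second
        self.capacity = burst        # maximum stored tokens
        self.tokens = float(burst)
        self.last = time.monotonic()

    def wait_time(self) -> float:
        """Seconds the caller must wait before sending the next request."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return 0.0
        deficit = 1.0 - self.tokens
        self.tokens -= 1.0           # token is reserved; caller sleeps off the deficit
        return deficit / self.rate

bucket = TokenBucket(qps=5.0, burst=10)
delays = [bucket.wait_time() for _ in range(12)]
# The first `burst` requests pass immediately; later ones queue with growing delays.
```

Under this model, a burst of cache-priming GETs (like the reflector list requests in this log) drains the bucket, and subsequent requests report a wait roughly equal to the accumulated token deficit divided by the refill rate.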
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.974015 4892 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.974072 4892 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.974123 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f377a3b2-2464-4568-a5d9-0557bc175309-serving-cert podName:f377a3b2-2464-4568-a5d9-0557bc175309 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.47410015 +0000 UTC m=+141.849503415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f377a3b2-2464-4568-a5d9-0557bc175309-serving-cert") pod "service-ca-operator-777779d784-pm4rt" (UID: "f377a3b2-2464-4568-a5d9-0557bc175309") : failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.974157 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-config podName:fc208abe-d868-4d0e-bd8c-9f93e2e5c05a nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.47413422 +0000 UTC m=+141.849537495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-config") pod "kube-storage-version-migrator-operator-b67b599dd-km9qc" (UID: "fc208abe-d868-4d0e-bd8c-9f93e2e5c05a") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.974475 4892 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.974526 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-serving-cert podName:fc208abe-d868-4d0e-bd8c-9f93e2e5c05a nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.47451553 +0000 UTC m=+141.849918895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-km9qc" (UID: "fc208abe-d868-4d0e-bd8c-9f93e2e5c05a") : failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.975274 4892 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.975339 4892 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.975404 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-webhook-cert podName:e7188959-438f-469b-8c0e-8e2af8c54d3f nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.475382364 +0000 UTC m=+141.850785629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-webhook-cert") pod "packageserver-d55dfcdfc-9bbcq" (UID: "e7188959-438f-469b-8c0e-8e2af8c54d3f") : failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.975434 4892 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.975474 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36e96f72-2b42-4b98-b6f1-5fe676510aed-proxy-tls podName:36e96f72-2b42-4b98-b6f1-5fe676510aed nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.475438866 +0000 UTC m=+141.850842141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/36e96f72-2b42-4b98-b6f1-5fe676510aed-proxy-tls") pod "machine-config-controller-84d6567774-d9zlz" (UID: "36e96f72-2b42-4b98-b6f1-5fe676510aed") : failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.975505 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-apiservice-cert podName:e7188959-438f-469b-8c0e-8e2af8c54d3f nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.475493107 +0000 UTC m=+141.850896392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-apiservice-cert") pod "packageserver-d55dfcdfc-9bbcq" (UID: "e7188959-438f-469b-8c0e-8e2af8c54d3f") : failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.976532 4892 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.976580 4892 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.976619 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f377a3b2-2464-4568-a5d9-0557bc175309-config podName:f377a3b2-2464-4568-a5d9-0557bc175309 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.476599836 +0000 UTC m=+141.852003321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f377a3b2-2464-4568-a5d9-0557bc175309-config") pod "service-ca-operator-777779d784-pm4rt" (UID: "f377a3b2-2464-4568-a5d9-0557bc175309") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: E0217 17:46:09.976661 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b571ef16-161d-4280-aaa0-3f15b76dcc21-webhook-certs podName:b571ef16-161d-4280-aaa0-3f15b76dcc21 nodeName:}" failed. No retries permitted until 2026-02-17 17:46:10.476638137 +0000 UTC m=+141.852041412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b571ef16-161d-4280-aaa0-3f15b76dcc21-webhook-certs") pod "multus-admission-controller-857f4d67dd-6wqsp" (UID: "b571ef16-161d-4280-aaa0-3f15b76dcc21") : failed to sync secret cache: timed out waiting for the condition
Feb 17 17:46:09 crc kubenswrapper[4892]: I0217 17:46:09.978097 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.005480 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.018300 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.037848 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.058604 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.078308 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.099141 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.118245 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.139402 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.158198 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.178908 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.198563 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.218721 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.238533 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.258139 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.279068 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.298947 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.318479 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.338520 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.358948 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.378196 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.398207 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.418503 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.438953 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.458542 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.478858 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.496597 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-webhook-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.496712 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36e96f72-2b42-4b98-b6f1-5fe676510aed-proxy-tls\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.496746 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b571ef16-161d-4280-aaa0-3f15b76dcc21-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.496772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.496936 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377a3b2-2464-4568-a5d9-0557bc175309-config\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.496983 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.497015 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377a3b2-2464-4568-a5d9-0557bc175309-serving-cert\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.497077 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.499599 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.500469 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.501496 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"
Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.501525 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\"
(UniqueName: \"kubernetes.io/secret/e7188959-438f-469b-8c0e-8e2af8c54d3f-webhook-cert\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.501738 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.501911 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36e96f72-2b42-4b98-b6f1-5fe676510aed-proxy-tls\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.506611 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b571ef16-161d-4280-aaa0-3f15b76dcc21-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.514295 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377a3b2-2464-4568-a5d9-0557bc175309-serving-cert\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:10 crc 
kubenswrapper[4892]: I0217 17:46:10.519142 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.520131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377a3b2-2464-4568-a5d9-0557bc175309-config\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.538955 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.558733 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.621106 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjkmd\" (UniqueName: \"kubernetes.io/projected/843d01a5-8de5-4628-99d0-2ac552e9abf5-kube-api-access-tjkmd\") pod \"downloads-7954f5f757-4q9d4\" (UID: \"843d01a5-8de5-4628-99d0-2ac552e9abf5\") " pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.632005 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47s6x\" (UniqueName: \"kubernetes.io/projected/2aa4da4a-e681-4199-b8b7-c4de1ed9c04e-kube-api-access-47s6x\") pod \"console-operator-58897d9998-t24t9\" (UID: \"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e\") " pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.636304 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.652675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65gw\" (UniqueName: \"kubernetes.io/projected/14853a40-bee2-4d8e-a148-5f0ae761d71c-kube-api-access-c65gw\") pod \"openshift-config-operator-7777fb866f-nj852\" (UID: \"14853a40-bee2-4d8e-a148-5f0ae761d71c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.673189 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5qpz\" (UniqueName: \"kubernetes.io/projected/183505a9-eeb4-4bf8-94ae-2c593e78b926-kube-api-access-j5qpz\") pod \"controller-manager-879f6c89f-st4m9\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.694665 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96s4p\" (UniqueName: \"kubernetes.io/projected/8e8f043a-de31-4c34-8ca8-d90a0ad0f125-kube-api-access-96s4p\") pod \"apiserver-7bbb656c7d-dc6pg\" (UID: \"8e8f043a-de31-4c34-8ca8-d90a0ad0f125\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.712648 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l8dj\" (UniqueName: \"kubernetes.io/projected/129ae682-89fa-4fa9-b4de-789ea8c0a9fd-kube-api-access-2l8dj\") pod \"dns-operator-744455d44c-hzxbm\" (UID: \"129ae682-89fa-4fa9-b4de-789ea8c0a9fd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.738368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9snm\" (UniqueName: 
\"kubernetes.io/projected/0b0cbdbb-671e-41e1-b494-a369938dab8e-kube-api-access-r9snm\") pod \"console-f9d7485db-zc25m\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.756124 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kms92\" (UniqueName: \"kubernetes.io/projected/ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1-kube-api-access-kms92\") pod \"authentication-operator-69f744f599-lzbcm\" (UID: \"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.782947 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfcts\" (UniqueName: \"kubernetes.io/projected/6c5aba9a-b953-479b-a16a-93af181b7445-kube-api-access-mfcts\") pod \"cluster-samples-operator-665b6dd947-hwwms\" (UID: \"6c5aba9a-b953-479b-a16a-93af181b7445\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.788944 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.799088 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5plsk\" (UniqueName: \"kubernetes.io/projected/4baae3f3-ceff-445d-9cf0-82284224bff7-kube-api-access-5plsk\") pod \"route-controller-manager-6576b87f9c-qm6w9\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.802713 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.811676 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.814073 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" event={"ID":"ac493e83-90b2-4dc6-a414-256a21927213","Type":"ContainerStarted","Data":"a9fad7201e6bb34efb44bd97696bfcb82db394a2e0f7f456e7bf50857cb37f0b"} Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.814116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" event={"ID":"ac493e83-90b2-4dc6-a414-256a21927213","Type":"ContainerStarted","Data":"73c9e058b54149cdbcc1341b781c54536524dd8e10eb6308b725cbb00fff4c53"} Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.821003 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.821489 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.832841 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.840446 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.842584 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t24t9"] Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.856988 4892 request.go:700] Waited for 1.946184154s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.859070 4892 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.866443 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.877934 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.898293 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.898627 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.902831 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.921129 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.924919 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.939677 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.944504 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.958764 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 17:46:10 crc kubenswrapper[4892]: I0217 17:46:10.994580 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smgtz\" (UniqueName: \"kubernetes.io/projected/fc208abe-d868-4d0e-bd8c-9f93e2e5c05a-kube-api-access-smgtz\") pod \"kube-storage-version-migrator-operator-b67b599dd-km9qc\" (UID: \"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.020720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clnks\" (UniqueName: \"kubernetes.io/projected/36e96f72-2b42-4b98-b6f1-5fe676510aed-kube-api-access-clnks\") pod \"machine-config-controller-84d6567774-d9zlz\" (UID: \"36e96f72-2b42-4b98-b6f1-5fe676510aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.036513 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.040367 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50dcb3e4-a35c-4e36-872c-32a252f542a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wrdm2\" (UID: \"50dcb3e4-a35c-4e36-872c-32a252f542a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.058466 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwgr5\" 
(UniqueName: \"kubernetes.io/projected/f377a3b2-2464-4568-a5d9-0557bc175309-kube-api-access-bwgr5\") pod \"service-ca-operator-777779d784-pm4rt\" (UID: \"f377a3b2-2464-4568-a5d9-0557bc175309\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.076090 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdps\" (UniqueName: \"kubernetes.io/projected/0168b546-66c4-4663-a840-4b4b1968d725-kube-api-access-5jdps\") pod \"migrator-59844c95c7-qw5r5\" (UID: \"0168b546-66c4-4663-a840-4b4b1968d725\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.106383 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.116047 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lzbcm"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.129610 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fjj\" (UniqueName: \"kubernetes.io/projected/b571ef16-161d-4280-aaa0-3f15b76dcc21-kube-api-access-d6fjj\") pod \"multus-admission-controller-857f4d67dd-6wqsp\" (UID: \"b571ef16-161d-4280-aaa0-3f15b76dcc21\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:11 crc kubenswrapper[4892]: W0217 17:46:11.130028 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6ef0ff_6a6d_4e4e_b396_1de6fd632cd1.slice/crio-c51794a9b2951e7ee64fc169f38df2c4bffaec7d8926f94a9ca12cf75e6746e9 WatchSource:0}: Error finding container c51794a9b2951e7ee64fc169f38df2c4bffaec7d8926f94a9ca12cf75e6746e9: Status 404 returned error can't find the container with id c51794a9b2951e7ee64fc169f38df2c4bffaec7d8926f94a9ca12cf75e6746e9 Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.134048 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.134389 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72a2852b-7dd5-4681-b330-b8e4128d75b0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kc9rz\" (UID: \"72a2852b-7dd5-4681-b330-b8e4128d75b0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.151628 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.154074 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrnv6\" (UniqueName: \"kubernetes.io/projected/e1952c19-97f1-41ae-b9de-834d2325c943-kube-api-access-hrnv6\") pod \"catalog-operator-68c6474976-x6gtg\" (UID: \"e1952c19-97f1-41ae-b9de-834d2325c943\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.159655 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.176588 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.176910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn68r\" (UniqueName: \"kubernetes.io/projected/752ea4e8-6b0b-4fab-9e96-f21100f3e72d-kube-api-access-xn68r\") pod \"cluster-image-registry-operator-dc59b4c8b-k9jsf\" (UID: \"752ea4e8-6b0b-4fab-9e96-f21100f3e72d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.192377 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vbvj\" (UniqueName: \"kubernetes.io/projected/58da6748-a277-48e6-a169-6e6477486e44-kube-api-access-5vbvj\") pod \"oauth-openshift-558db77b4-n8n76\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.218779 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sjd\" (UniqueName: \"kubernetes.io/projected/e7188959-438f-469b-8c0e-8e2af8c54d3f-kube-api-access-47sjd\") pod \"packageserver-d55dfcdfc-9bbcq\" (UID: \"e7188959-438f-469b-8c0e-8e2af8c54d3f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.294102 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.310798 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311185 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52tz9\" (UniqueName: \"kubernetes.io/projected/2a08a25e-d2b9-4a1d-aeb6-e968436a9472-kube-api-access-52tz9\") pod \"package-server-manager-789f6589d5-ps5cl\" (UID: \"2a08a25e-d2b9-4a1d-aeb6-e968436a9472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311229 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-registry-certificates\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311257 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5n65\" (UniqueName: \"kubernetes.io/projected/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-kube-api-access-m5n65\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311284 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-stats-auth\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311320 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svpj\" (UniqueName: \"kubernetes.io/projected/1111f441-f7d1-4115-b288-48cef127137a-kube-api-access-4svpj\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-default-certificate\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311377 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3fc996d-107b-4647-b52f-54fef31f9059-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311398 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jfl\" (UniqueName: \"kubernetes.io/projected/f6f8d659-22a7-478b-a52a-f1a82ee5a40a-kube-api-access-f6jfl\") pod \"control-plane-machine-set-operator-78cbb6b69f-vhwpb\" (UID: \"f6f8d659-22a7-478b-a52a-f1a82ee5a40a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311418 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2st9w\" (UniqueName: \"kubernetes.io/projected/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-kube-api-access-2st9w\") 
pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311442 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-machine-approver-tls\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311467 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6f8d659-22a7-478b-a52a-f1a82ee5a40a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vhwpb\" (UID: \"f6f8d659-22a7-478b-a52a-f1a82ee5a40a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311489 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1af316b8-cbe5-47df-8e9b-8e5d839aee33-signing-cabundle\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec23c19-a8c4-4079-bd8d-897a91ad009f-cert\") pod \"ingress-canary-xcgd8\" (UID: \"7ec23c19-a8c4-4079-bd8d-897a91ad009f\") " pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311531 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-auth-proxy-config\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311578 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-client\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311629 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0572b1a8-4e2e-4d75-b1ab-34da859539cf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311649 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1111f441-f7d1-4115-b288-48cef127137a-secret-volume\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311671 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df12f42-c275-445f-8c39-32d009a322d5-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311693 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9spm\" (UniqueName: \"kubernetes.io/projected/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-kube-api-access-l9spm\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a08a25e-d2b9-4a1d-aeb6-e968436a9472-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ps5cl\" (UID: \"2a08a25e-d2b9-4a1d-aeb6-e968436a9472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311742 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a311419d-f51d-420d-aa7a-01cff548c963-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311764 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-proxy-tls\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311800 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a311419d-f51d-420d-aa7a-01cff548c963-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311875 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-service-ca-bundle\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311902 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnsk\" (UniqueName: \"kubernetes.io/projected/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-kube-api-access-jxnsk\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311932 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgsv\" (UniqueName: \"kubernetes.io/projected/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-kube-api-access-djgsv\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311955 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1af316b8-cbe5-47df-8e9b-8e5d839aee33-signing-key\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311979 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v9t8\" (UniqueName: \"kubernetes.io/projected/7ec23c19-a8c4-4079-bd8d-897a91ad009f-kube-api-access-4v9t8\") pod \"ingress-canary-xcgd8\" (UID: \"7ec23c19-a8c4-4079-bd8d-897a91ad009f\") " pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.311999 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1111f441-f7d1-4115-b288-48cef127137a-config-volume\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312024 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312048 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-srv-cert\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312074 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-registry-tls\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312105 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-trusted-ca\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312129 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-config\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312149 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-images\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-ca\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312453 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7gt9\" (UniqueName: \"kubernetes.io/projected/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-kube-api-access-r7gt9\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.312478 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:11.812464145 +0000 UTC m=+143.187867490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99bb3092-4fdb-4533-98e7-45e62ac33a4a-serving-cert\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312541 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjk4h\" (UniqueName: \"kubernetes.io/projected/8df12f42-c275-445f-8c39-32d009a322d5-kube-api-access-wjk4h\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.312982 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0572b1a8-4e2e-4d75-b1ab-34da859539cf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313031 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313191 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx95k\" (UniqueName: \"kubernetes.io/projected/99bb3092-4fdb-4533-98e7-45e62ac33a4a-kube-api-access-xx95k\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313222 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313302 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-config\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313342 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jz6\" (UniqueName: \"kubernetes.io/projected/1af316b8-cbe5-47df-8e9b-8e5d839aee33-kube-api-access-w2jz6\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" 
Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a311419d-f51d-420d-aa7a-01cff548c963-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313383 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313402 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-metrics-tls\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313429 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3fc996d-107b-4647-b52f-54fef31f9059-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.313447 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-metrics-certs\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8df12f42-c275-445f-8c39-32d009a322d5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315638 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315678 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-trusted-ca\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315727 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhlmx\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-kube-api-access-vhlmx\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315760 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-service-ca\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-bound-sa-token\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.315917 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhh8z\" (UniqueName: \"kubernetes.io/projected/0572b1a8-4e2e-4d75-b1ab-34da859539cf-kube-api-access-mhh8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.316053 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.322265 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.324874 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-st4m9"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.330426 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4q9d4"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.342583 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.371469 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.373955 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nj852"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.392708 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.398235 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz"] Feb 17 17:46:11 crc kubenswrapper[4892]: W0217 17:46:11.401292 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183505a9_eeb4_4bf8_94ae_2c593e78b926.slice/crio-944a81ae14b6bd46b5d8e3a477b6440d90f4a94880c87a970f73185ea8da5b09 WatchSource:0}: Error finding container 944a81ae14b6bd46b5d8e3a477b6440d90f4a94880c87a970f73185ea8da5b09: Status 404 returned error can't find the container with id 944a81ae14b6bd46b5d8e3a477b6440d90f4a94880c87a970f73185ea8da5b09 Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417262 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417408 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-ca\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417445 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p26b\" (UniqueName: \"kubernetes.io/projected/1b62a756-057b-45a6-b568-b788dbb95fa0-kube-api-access-9p26b\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " 
pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417465 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7gt9\" (UniqueName: \"kubernetes.io/projected/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-kube-api-access-r7gt9\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417485 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjk4h\" (UniqueName: \"kubernetes.io/projected/8df12f42-c275-445f-8c39-32d009a322d5-kube-api-access-wjk4h\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417503 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99bb3092-4fdb-4533-98e7-45e62ac33a4a-serving-cert\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417536 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0572b1a8-4e2e-4d75-b1ab-34da859539cf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417586 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx95k\" (UniqueName: \"kubernetes.io/projected/99bb3092-4fdb-4533-98e7-45e62ac33a4a-kube-api-access-xx95k\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417599 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417615 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-config\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417630 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jz6\" (UniqueName: \"kubernetes.io/projected/1af316b8-cbe5-47df-8e9b-8e5d839aee33-kube-api-access-w2jz6\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417645 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a311419d-f51d-420d-aa7a-01cff548c963-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417660 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417692 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-metrics-tls\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3fc996d-107b-4647-b52f-54fef31f9059-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-metrics-certs\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " 
pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417776 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8df12f42-c275-445f-8c39-32d009a322d5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417801 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417835 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-trusted-ca\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417861 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhlmx\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-kube-api-access-vhlmx\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417875 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-service-ca\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417900 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-bound-sa-token\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417921 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhh8z\" (UniqueName: \"kubernetes.io/projected/0572b1a8-4e2e-4d75-b1ab-34da859539cf-kube-api-access-mhh8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417947 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b62a756-057b-45a6-b568-b788dbb95fa0-metrics-tls\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.417974 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52tz9\" (UniqueName: \"kubernetes.io/projected/2a08a25e-d2b9-4a1d-aeb6-e968436a9472-kube-api-access-52tz9\") pod \"package-server-manager-789f6589d5-ps5cl\" (UID: \"2a08a25e-d2b9-4a1d-aeb6-e968436a9472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: 
I0217 17:46:11.417998 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-registry-certificates\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418032 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5n65\" (UniqueName: \"kubernetes.io/projected/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-kube-api-access-m5n65\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418056 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-stats-auth\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svpj\" (UniqueName: \"kubernetes.io/projected/1111f441-f7d1-4115-b288-48cef127137a-kube-api-access-4svpj\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418107 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-default-certificate\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " 
pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418143 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3fc996d-107b-4647-b52f-54fef31f9059-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418158 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jfl\" (UniqueName: \"kubernetes.io/projected/f6f8d659-22a7-478b-a52a-f1a82ee5a40a-kube-api-access-f6jfl\") pod \"control-plane-machine-set-operator-78cbb6b69f-vhwpb\" (UID: \"f6f8d659-22a7-478b-a52a-f1a82ee5a40a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2st9w\" (UniqueName: \"kubernetes.io/projected/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-kube-api-access-2st9w\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418222 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-machine-approver-tls\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418239 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f6f8d659-22a7-478b-a52a-f1a82ee5a40a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vhwpb\" (UID: \"f6f8d659-22a7-478b-a52a-f1a82ee5a40a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418255 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1af316b8-cbe5-47df-8e9b-8e5d839aee33-signing-cabundle\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-mountpoint-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418325 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec23c19-a8c4-4079-bd8d-897a91ad009f-cert\") pod \"ingress-canary-xcgd8\" (UID: \"7ec23c19-a8c4-4079-bd8d-897a91ad009f\") " pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418340 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-plugins-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418377 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-auth-proxy-config\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418393 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrw7k\" (UniqueName: \"kubernetes.io/projected/29cdf011-a502-4b68-932e-4d7df7e2b0ae-kube-api-access-zrw7k\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418418 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-csi-data-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418445 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-client\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418470 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8kpq\" (UniqueName: \"kubernetes.io/projected/8607d279-cebb-4590-a0c0-7dda3de5dfd5-kube-api-access-l8kpq\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc 
kubenswrapper[4892]: I0217 17:46:11.418486 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0572b1a8-4e2e-4d75-b1ab-34da859539cf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418502 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1111f441-f7d1-4115-b288-48cef127137a-secret-volume\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418518 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/29cdf011-a502-4b68-932e-4d7df7e2b0ae-node-bootstrap-token\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418542 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df12f42-c275-445f-8c39-32d009a322d5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418559 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9spm\" (UniqueName: \"kubernetes.io/projected/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-kube-api-access-l9spm\") pod 
\"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418574 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a08a25e-d2b9-4a1d-aeb6-e968436a9472-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ps5cl\" (UID: \"2a08a25e-d2b9-4a1d-aeb6-e968436a9472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/29cdf011-a502-4b68-932e-4d7df7e2b0ae-certs\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418621 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a311419d-f51d-420d-aa7a-01cff548c963-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418636 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-proxy-tls\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418653 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a311419d-f51d-420d-aa7a-01cff548c963-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418699 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-service-ca-bundle\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418715 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnsk\" (UniqueName: \"kubernetes.io/projected/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-kube-api-access-jxnsk\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418730 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-registration-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418755 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgsv\" (UniqueName: \"kubernetes.io/projected/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-kube-api-access-djgsv\") pod \"router-default-5444994796-ddw22\" (UID: 
\"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418773 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1af316b8-cbe5-47df-8e9b-8e5d839aee33-signing-key\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418792 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v9t8\" (UniqueName: \"kubernetes.io/projected/7ec23c19-a8c4-4079-bd8d-897a91ad009f-kube-api-access-4v9t8\") pod \"ingress-canary-xcgd8\" (UID: \"7ec23c19-a8c4-4079-bd8d-897a91ad009f\") " pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418817 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1111f441-f7d1-4115-b288-48cef127137a-config-volume\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418867 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418902 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-srv-cert\") pod 
\"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418932 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-socket-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418946 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b62a756-057b-45a6-b568-b788dbb95fa0-config-volume\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.418979 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-registry-tls\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.419003 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-trusted-ca\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.419063 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-config\") 
pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.419112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-images\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.420145 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:11.920126536 +0000 UTC m=+143.295529801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.421003 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-images\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.421548 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-ca\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: W0217 17:46:11.421783 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod843d01a5_8de5_4628_99d0_2ac552e9abf5.slice/crio-047c19b27db261bf47e3743e82f2e6b0d89eacbcbdf63cc041f3e5029b539e88 WatchSource:0}: Error finding container 047c19b27db261bf47e3743e82f2e6b0d89eacbcbdf63cc041f3e5029b539e88: Status 404 returned error can't find the container with id 047c19b27db261bf47e3743e82f2e6b0d89eacbcbdf63cc041f3e5029b539e88 Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.423256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.423406 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a311419d-f51d-420d-aa7a-01cff548c963-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.429882 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-config\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.430141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-service-ca\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.430844 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.430859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-metrics-certs\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.431244 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99bb3092-4fdb-4533-98e7-45e62ac33a4a-serving-cert\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.431400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8df12f42-c275-445f-8c39-32d009a322d5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: 
\"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.431465 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0572b1a8-4e2e-4d75-b1ab-34da859539cf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.432045 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-metrics-tls\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.432460 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-trusted-ca\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.432562 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3fc996d-107b-4647-b52f-54fef31f9059-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.438388 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f6f8d659-22a7-478b-a52a-f1a82ee5a40a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vhwpb\" (UID: \"f6f8d659-22a7-478b-a52a-f1a82ee5a40a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.440768 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1111f441-f7d1-4115-b288-48cef127137a-config-volume\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.441061 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1af316b8-cbe5-47df-8e9b-8e5d839aee33-signing-cabundle\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.441894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-registry-certificates\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.442738 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3fc996d-107b-4647-b52f-54fef31f9059-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.443894 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.444366 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec23c19-a8c4-4079-bd8d-897a91ad009f-cert\") pod \"ingress-canary-xcgd8\" (UID: \"7ec23c19-a8c4-4079-bd8d-897a91ad009f\") " pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.444445 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.446487 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-config\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.447604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0572b1a8-4e2e-4d75-b1ab-34da859539cf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.452300 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-srv-cert\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc 
kubenswrapper[4892]: I0217 17:46:11.452631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-trusted-ca\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.452836 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-auth-proxy-config\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.453167 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-proxy-tls\") pod \"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.453862 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-registry-tls\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.453965 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-service-ca-bundle\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " 
pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.454313 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1af316b8-cbe5-47df-8e9b-8e5d839aee33-signing-key\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.454448 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.454322 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a311419d-f51d-420d-aa7a-01cff548c963-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.454669 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-machine-approver-tls\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.454767 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1111f441-f7d1-4115-b288-48cef127137a-secret-volume\") pod 
\"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.454994 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df12f42-c275-445f-8c39-32d009a322d5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.457815 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-default-certificate\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.462913 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a08a25e-d2b9-4a1d-aeb6-e968436a9472-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ps5cl\" (UID: \"2a08a25e-d2b9-4a1d-aeb6-e968436a9472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.469237 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.470521 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-6wqsp"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.472334 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jz6\" (UniqueName: \"kubernetes.io/projected/1af316b8-cbe5-47df-8e9b-8e5d839aee33-kube-api-access-w2jz6\") pod \"service-ca-9c57cc56f-v4nm6\" (UID: \"1af316b8-cbe5-47df-8e9b-8e5d839aee33\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.472446 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/99bb3092-4fdb-4533-98e7-45e62ac33a4a-etcd-client\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.477768 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-stats-auth\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.483738 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjk4h\" (UniqueName: \"kubernetes.io/projected/8df12f42-c275-445f-8c39-32d009a322d5-kube-api-access-wjk4h\") pod \"openshift-apiserver-operator-796bbdcf4f-lxn66\" (UID: \"8df12f42-c275-445f-8c39-32d009a322d5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.491087 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7gt9\" (UniqueName: \"kubernetes.io/projected/d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b-kube-api-access-r7gt9\") pod 
\"machine-config-operator-74547568cd-bgzdj\" (UID: \"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.510502 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.511534 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.527203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b62a756-057b-45a6-b568-b788dbb95fa0-metrics-tls\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.527970 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-mountpoint-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-plugins-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528078 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrw7k\" (UniqueName: \"kubernetes.io/projected/29cdf011-a502-4b68-932e-4d7df7e2b0ae-kube-api-access-zrw7k\") pod 
\"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-csi-data-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528147 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8kpq\" (UniqueName: \"kubernetes.io/projected/8607d279-cebb-4590-a0c0-7dda3de5dfd5-kube-api-access-l8kpq\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528165 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/29cdf011-a502-4b68-932e-4d7df7e2b0ae-node-bootstrap-token\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528192 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/29cdf011-a502-4b68-932e-4d7df7e2b0ae-certs\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528230 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-registration-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528272 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-socket-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528287 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b62a756-057b-45a6-b568-b788dbb95fa0-config-volume\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528356 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528379 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p26b\" (UniqueName: \"kubernetes.io/projected/1b62a756-057b-45a6-b568-b788dbb95fa0-kube-api-access-9p26b\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528376 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-plugins-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-mountpoint-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528534 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-registration-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528574 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-socket-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528590 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jfl\" (UniqueName: \"kubernetes.io/projected/f6f8d659-22a7-478b-a52a-f1a82ee5a40a-kube-api-access-f6jfl\") pod \"control-plane-machine-set-operator-78cbb6b69f-vhwpb\" (UID: \"f6f8d659-22a7-478b-a52a-f1a82ee5a40a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.528990 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/8607d279-cebb-4590-a0c0-7dda3de5dfd5-csi-data-dir\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.529524 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.029491752 +0000 UTC m=+143.404895017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.529773 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b62a756-057b-45a6-b568-b788dbb95fa0-config-volume\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.538196 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/29cdf011-a502-4b68-932e-4d7df7e2b0ae-node-bootstrap-token\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.541040 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/29cdf011-a502-4b68-932e-4d7df7e2b0ae-certs\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.546608 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b62a756-057b-45a6-b568-b788dbb95fa0-metrics-tls\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.551639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnsk\" (UniqueName: \"kubernetes.io/projected/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-kube-api-access-jxnsk\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.559011 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.560898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx95k\" (UniqueName: \"kubernetes.io/projected/99bb3092-4fdb-4533-98e7-45e62ac33a4a-kube-api-access-xx95k\") pod \"etcd-operator-b45778765-72rk9\" (UID: \"99bb3092-4fdb-4533-98e7-45e62ac33a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.569588 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hzxbm"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.583297 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a81a9079-32b0-41a0-ae32-13fcc5c70c0b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2rzft\" (UID: \"a81a9079-32b0-41a0-ae32-13fcc5c70c0b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.588463 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zc25m"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.592117 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhlmx\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-kube-api-access-vhlmx\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.616740 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52tz9\" (UniqueName: \"kubernetes.io/projected/2a08a25e-d2b9-4a1d-aeb6-e968436a9472-kube-api-access-52tz9\") pod \"package-server-manager-789f6589d5-ps5cl\" 
(UID: \"2a08a25e-d2b9-4a1d-aeb6-e968436a9472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.628609 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.629138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.629477 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.129462529 +0000 UTC m=+143.504865794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.639431 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-bound-sa-token\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.652113 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.662474 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhh8z\" (UniqueName: \"kubernetes.io/projected/0572b1a8-4e2e-4d75-b1ab-34da859539cf-kube-api-access-mhh8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-znjqw\" (UID: \"0572b1a8-4e2e-4d75-b1ab-34da859539cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: W0217 17:46:11.675974 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b0cbdbb_671e_41e1_b494_a369938dab8e.slice/crio-cbe2bffa717fb5e983f997e71d1e9d0608da3e5cd7276d126e62aee0b5791db7 WatchSource:0}: Error finding container cbe2bffa717fb5e983f997e71d1e9d0608da3e5cd7276d126e62aee0b5791db7: Status 404 returned error can't find the container 
with id cbe2bffa717fb5e983f997e71d1e9d0608da3e5cd7276d126e62aee0b5791db7 Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.676734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2st9w\" (UniqueName: \"kubernetes.io/projected/a94de5f7-987c-43ae-90e9-bc7a48fc4d6a-kube-api-access-2st9w\") pod \"olm-operator-6b444d44fb-ph8l6\" (UID: \"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.677051 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.678694 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.685945 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.693953 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.699401 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.704368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgsv\" (UniqueName: \"kubernetes.io/projected/7c9c06c9-0472-42e8-906f-2c7d8ec5608d-kube-api-access-djgsv\") pod \"router-default-5444994796-ddw22\" (UID: \"7c9c06c9-0472-42e8-906f-2c7d8ec5608d\") " pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.704534 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.723179 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v9t8\" (UniqueName: \"kubernetes.io/projected/7ec23c19-a8c4-4079-bd8d-897a91ad009f-kube-api-access-4v9t8\") pod \"ingress-canary-xcgd8\" (UID: \"7ec23c19-a8c4-4079-bd8d-897a91ad009f\") " pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.728324 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xcgd8" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.730191 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.730518 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 17:46:12.230507924 +0000 UTC m=+143.605911189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.749703 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5n65\" (UniqueName: \"kubernetes.io/projected/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-kube-api-access-m5n65\") pod \"marketplace-operator-79b997595-xlhd5\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:11 crc kubenswrapper[4892]: W0217 17:46:11.760483 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50dcb3e4_a35c_4e36_872c_32a252f542a7.slice/crio-a68aa67e428af085cc0acddfc86f85c1c204e8b4f290b740e500590b16252111 WatchSource:0}: Error finding container a68aa67e428af085cc0acddfc86f85c1c204e8b4f290b740e500590b16252111: Status 404 returned error can't find the container with id a68aa67e428af085cc0acddfc86f85c1c204e8b4f290b740e500590b16252111 Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.760949 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svpj\" (UniqueName: \"kubernetes.io/projected/1111f441-f7d1-4115-b288-48cef127137a-kube-api-access-4svpj\") pod \"collect-profiles-29522505-6hhs6\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: 
I0217 17:46:11.769172 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.822098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a311419d-f51d-420d-aa7a-01cff548c963-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qj469\" (UID: \"a311419d-f51d-420d-aa7a-01cff548c963\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.824594 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9spm\" (UniqueName: \"kubernetes.io/projected/ae17ea52-fb8a-4d96-9b27-41f6fb99de88-kube-api-access-l9spm\") pod \"machine-approver-56656f9798-vm7qp\" (UID: \"ae17ea52-fb8a-4d96-9b27-41f6fb99de88\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.831760 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.832536 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.332519955 +0000 UTC m=+143.707923220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.851224 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.871015 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.879558 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.880865 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p26b\" (UniqueName: \"kubernetes.io/projected/1b62a756-057b-45a6-b568-b788dbb95fa0-kube-api-access-9p26b\") pod \"dns-default-cv2jf\" (UID: \"1b62a756-057b-45a6-b568-b788dbb95fa0\") " pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.881442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrw7k\" (UniqueName: \"kubernetes.io/projected/29cdf011-a502-4b68-932e-4d7df7e2b0ae-kube-api-access-zrw7k\") pod \"machine-config-server-22jlr\" (UID: \"29cdf011-a502-4b68-932e-4d7df7e2b0ae\") " pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.900601 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8kpq\" (UniqueName: \"kubernetes.io/projected/8607d279-cebb-4590-a0c0-7dda3de5dfd5-kube-api-access-l8kpq\") pod \"csi-hostpathplugin-nnn4l\" (UID: \"8607d279-cebb-4590-a0c0-7dda3de5dfd5\") " pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.924035 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.926138 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" event={"ID":"129ae682-89fa-4fa9-b4de-789ea8c0a9fd","Type":"ContainerStarted","Data":"6ae9473ab144901f75ae8e22c4038aff109eb90f963ed6f2e2cd3348361ff6ef"} Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.933779 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:11 crc kubenswrapper[4892]: E0217 17:46:11.934064 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.434054553 +0000 UTC m=+143.809457818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.934142 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.962075 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.965133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" event={"ID":"50dcb3e4-a35c-4e36-872c-32a252f542a7","Type":"ContainerStarted","Data":"a68aa67e428af085cc0acddfc86f85c1c204e8b4f290b740e500590b16252111"} Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.984064 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf"] Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.987525 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4q9d4" event={"ID":"843d01a5-8de5-4628-99d0-2ac552e9abf5","Type":"ContainerStarted","Data":"7cf401a6538a7d14f56048a7954c05749191abc55e184ad174c7c16667db82fe"} Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.987787 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4q9d4" 
event={"ID":"843d01a5-8de5-4628-99d0-2ac552e9abf5","Type":"ContainerStarted","Data":"047c19b27db261bf47e3743e82f2e6b0d89eacbcbdf63cc041f3e5029b539e88"} Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.987804 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.994612 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-4q9d4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.994654 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4q9d4" podUID="843d01a5-8de5-4628-99d0-2ac552e9abf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.997142 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" event={"ID":"4baae3f3-ceff-445d-9cf0-82284224bff7","Type":"ContainerStarted","Data":"1f20e94cde5065059433c85feeb14e30293940ceb285307fd606d09b79f93de8"} Feb 17 17:46:11 crc kubenswrapper[4892]: I0217 17:46:11.997657 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.002851 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t24t9" event={"ID":"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e","Type":"ContainerStarted","Data":"334e07f0a86e8b6c1312aa908c6941fe856213e52463ee86809bf2179b59e59d"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 
17:46:12.002874 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t24t9" event={"ID":"2aa4da4a-e681-4199-b8b7-c4de1ed9c04e","Type":"ContainerStarted","Data":"b34cc8562fbc59dd949136f684cfc34265e09c982607dd42511f61e0a11ee4b4"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.004120 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.006158 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.006573 4892 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qm6w9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.006609 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.007991 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" event={"ID":"0168b546-66c4-4663-a840-4b4b1968d725","Type":"ContainerStarted","Data":"6ca07528df112d38265a8ef26683d68664b563b0f4d6fba54f211fa43992bc07"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.008008 4892 patch_prober.go:28] interesting pod/console-operator-58897d9998-t24t9 container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.008054 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t24t9" podUID="2aa4da4a-e681-4199-b8b7-c4de1ed9c04e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.015144 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.019583 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" event={"ID":"183505a9-eeb4-4bf8-94ae-2c593e78b926","Type":"ContainerStarted","Data":"944a81ae14b6bd46b5d8e3a477b6440d90f4a94880c87a970f73185ea8da5b09"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.035305 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.035963 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.53593763 +0000 UTC m=+143.911340915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.045557 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n8n76"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.070241 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e8f043a-de31-4c34-8ca8-d90a0ad0f125" containerID="59ef0ad22884ed637ceea90dec3fe65ea024b2a886b5e224f0edc591d450b349" exitCode=0 Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.070312 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" event={"ID":"8e8f043a-de31-4c34-8ca8-d90a0ad0f125","Type":"ContainerDied","Data":"59ef0ad22884ed637ceea90dec3fe65ea024b2a886b5e224f0edc591d450b349"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.070513 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" event={"ID":"8e8f043a-de31-4c34-8ca8-d90a0ad0f125","Type":"ContainerStarted","Data":"2bf8aa3ec2652a08736e53b6b5eb7dc2ccd03192b6259db7b5269feb1d8722df"} Feb 17 17:46:12 crc kubenswrapper[4892]: W0217 17:46:12.070496 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1952c19_97f1_41ae_b9de_834d2325c943.slice/crio-437d365bc5035ba20ff78bb369f08abfd1b1b7e95a12e853019e07da51a1d8ad WatchSource:0}: Error finding container 437d365bc5035ba20ff78bb369f08abfd1b1b7e95a12e853019e07da51a1d8ad: Status 404 returned error can't find the 
container with id 437d365bc5035ba20ff78bb369f08abfd1b1b7e95a12e853019e07da51a1d8ad Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.073165 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" event={"ID":"36e96f72-2b42-4b98-b6f1-5fe676510aed","Type":"ContainerStarted","Data":"f5e0566596e5ff2b0c7daa20c084200aec0ea4fa6a03eb29ef5cdd8a38e75811"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.074204 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" event={"ID":"b571ef16-161d-4280-aaa0-3f15b76dcc21","Type":"ContainerStarted","Data":"09cdcb9bec3918ac2b22362269007f3dbabffbc021c1acbf76195afb9e43aec2"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.077671 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" event={"ID":"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1","Type":"ContainerStarted","Data":"2e6e4f7c66a5db068928457e78d60c99e53625f8ffd6f755f1c69747aaa97bcb"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.077715 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" event={"ID":"ca6ef0ff-6a6d-4e4e-b396-1de6fd632cd1","Type":"ContainerStarted","Data":"c51794a9b2951e7ee64fc169f38df2c4bffaec7d8926f94a9ca12cf75e6746e9"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.085060 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-22jlr" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.087020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" event={"ID":"f377a3b2-2464-4568-a5d9-0557bc175309","Type":"ContainerStarted","Data":"4bf818d76a5b81191ffcb25bf3340b00e9859e1aeeb23aaf8f91f1eb3872701d"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.090234 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" event={"ID":"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a","Type":"ContainerStarted","Data":"9724b15493b0b51d8bbe700bec2172dc898636804f89462097d532bef41c17e8"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.091252 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" event={"ID":"6c5aba9a-b953-479b-a16a-93af181b7445","Type":"ContainerStarted","Data":"39d93ebba2503e82ddd7f9a6bde1772dc38f90a57c31e1090cee297f1a0e4114"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.102249 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" event={"ID":"14853a40-bee2-4d8e-a148-5f0ae761d71c","Type":"ContainerStarted","Data":"9d6ca017aae744d46178372de3b22cd2ebd00258b125613c19771e99610daeb6"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.102287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" event={"ID":"14853a40-bee2-4d8e-a148-5f0ae761d71c","Type":"ContainerStarted","Data":"674ec5920af39ed5161573bc300d6e6ea0806ea5721e376a3c7fbf75cc2b83b8"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.102979 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.105459 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc25m" event={"ID":"0b0cbdbb-671e-41e1-b494-a369938dab8e","Type":"ContainerStarted","Data":"cbe2bffa717fb5e983f997e71d1e9d0608da3e5cd7276d126e62aee0b5791db7"} Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.113648 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.132130 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.136964 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.138485 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.638470635 +0000 UTC m=+144.013873900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.143639 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.184053 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.237787 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.239065 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.739043567 +0000 UTC m=+144.114446932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.255431 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-72rk9"] Feb 17 17:46:12 crc kubenswrapper[4892]: W0217 17:46:12.262617 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7188959_438f_469b_8c0e_8e2af8c54d3f.slice/crio-953e94500ff9520d52dbdd25e914b4262bfa3b24f16ecd5a06d5bb0c34c59625 WatchSource:0}: Error finding container 953e94500ff9520d52dbdd25e914b4262bfa3b24f16ecd5a06d5bb0c34c59625: Status 404 returned error can't find the container with id 953e94500ff9520d52dbdd25e914b4262bfa3b24f16ecd5a06d5bb0c34c59625 Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.276918 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl"] Feb 17 17:46:12 crc kubenswrapper[4892]: W0217 17:46:12.313099 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8df12f42_c275_445f_8c39_32d009a322d5.slice/crio-10525b0ef561ea57163c6b46bc073750cd46183e4407da6b82b42160aa68165d WatchSource:0}: Error finding container 10525b0ef561ea57163c6b46bc073750cd46183e4407da6b82b42160aa68165d: Status 404 returned error can't find the container with id 10525b0ef561ea57163c6b46bc073750cd46183e4407da6b82b42160aa68165d Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.339618 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.339953 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.839941208 +0000 UTC m=+144.215344473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.382373 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb"] Feb 17 17:46:12 crc kubenswrapper[4892]: W0217 17:46:12.428833 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99bb3092_4fdb_4533_98e7_45e62ac33a4a.slice/crio-9875400ae6f1f940864552fac10136ace6f1b426dcb94abee4c68ff85ace03f0 WatchSource:0}: Error finding container 9875400ae6f1f940864552fac10136ace6f1b426dcb94abee4c68ff85ace03f0: Status 404 returned error can't find the container with id 9875400ae6f1f940864552fac10136ace6f1b426dcb94abee4c68ff85ace03f0 Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 
17:46:12.443829 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.444261 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:12.944238939 +0000 UTC m=+144.319642204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.457557 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.457814 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" podStartSLOduration=121.457794231 podStartE2EDuration="2m1.457794231s" podCreationTimestamp="2026-02-17 17:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:12.456670351 +0000 UTC m=+143.832073616" watchObservedRunningTime="2026-02-17 17:46:12.457794231 +0000 UTC m=+143.833197496" Feb 17 17:46:12 crc 
kubenswrapper[4892]: W0217 17:46:12.501415 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae17ea52_fb8a_4d96_9b27_41f6fb99de88.slice/crio-83f3279eea3ab4540e0e32eac0e622b78f5c9c942578bf6872ecfb1fc499a29b WatchSource:0}: Error finding container 83f3279eea3ab4540e0e32eac0e622b78f5c9c942578bf6872ecfb1fc499a29b: Status 404 returned error can't find the container with id 83f3279eea3ab4540e0e32eac0e622b78f5c9c942578bf6872ecfb1fc499a29b Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.544652 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.545210 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.045197292 +0000 UTC m=+144.420600557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.616692 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xcgd8"] Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.645931 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.145903028 +0000 UTC m=+144.521306293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.645994 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.646467 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.646756 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.14674844 +0000 UTC m=+144.522151705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.650112 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.758100 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.758500 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.2584834 +0000 UTC m=+144.633886665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.840436 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6"] Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.864582 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.864886 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.364875068 +0000 UTC m=+144.740278333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.877543 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xlhd5"] Feb 17 17:46:12 crc kubenswrapper[4892]: W0217 17:46:12.892451 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda94de5f7_987c_43ae_90e9_bc7a48fc4d6a.slice/crio-2d0f382f8a5c33fad484510f1cb395d7cb3ed172562d68ca8d5d21b7a4283f53 WatchSource:0}: Error finding container 2d0f382f8a5c33fad484510f1cb395d7cb3ed172562d68ca8d5d21b7a4283f53: Status 404 returned error can't find the container with id 2d0f382f8a5c33fad484510f1cb395d7cb3ed172562d68ca8d5d21b7a4283f53 Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.917215 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v4nm6"] Feb 17 17:46:12 crc kubenswrapper[4892]: W0217 17:46:12.957898 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1111f441_f7d1_4115_b288_48cef127137a.slice/crio-2e380bd5921e06b4e83a586224baf0fd4201a38c1057f932dffc2ad51d7100c7 WatchSource:0}: Error finding container 2e380bd5921e06b4e83a586224baf0fd4201a38c1057f932dffc2ad51d7100c7: Status 404 returned error can't find the container with id 2e380bd5921e06b4e83a586224baf0fd4201a38c1057f932dffc2ad51d7100c7 Feb 17 17:46:12 crc kubenswrapper[4892]: I0217 17:46:12.965380 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:12 crc kubenswrapper[4892]: E0217 17:46:12.965767 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.465747738 +0000 UTC m=+144.841151003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: W0217 17:46:13.018329 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4dd9a2_87aa_4405_a4cc_778c778aaec9.slice/crio-7a4dd4a3098a25aaa08eab3545192f2342640940c35e0699f2c31176c8a92602 WatchSource:0}: Error finding container 7a4dd4a3098a25aaa08eab3545192f2342640940c35e0699f2c31176c8a92602: Status 404 returned error can't find the container with id 7a4dd4a3098a25aaa08eab3545192f2342640940c35e0699f2c31176c8a92602 Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.067176 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.067927 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.567914184 +0000 UTC m=+144.943317449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.109021 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-zh54l" podStartSLOduration=121.108999159 podStartE2EDuration="2m1.108999159s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.097105882 +0000 UTC m=+144.472509147" watchObservedRunningTime="2026-02-17 17:46:13.108999159 +0000 UTC m=+144.484402434" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.115139 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nnn4l"] Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.138020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" 
event={"ID":"129ae682-89fa-4fa9-b4de-789ea8c0a9fd","Type":"ContainerStarted","Data":"80ea5e7157284942c544a8c04d6ea77f5c7bf90c7e1fa832337b3cc940231487"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.140206 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" event={"ID":"1111f441-f7d1-4115-b288-48cef127137a","Type":"ContainerStarted","Data":"2e380bd5921e06b4e83a586224baf0fd4201a38c1057f932dffc2ad51d7100c7"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.141704 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" event={"ID":"6c5aba9a-b953-479b-a16a-93af181b7445","Type":"ContainerStarted","Data":"aef283bc20d52e4a381068c50cc7c489fb22288f9e90bc0d35e5e400c9ed89bf"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.142344 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw"] Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.142754 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-22jlr" event={"ID":"29cdf011-a502-4b68-932e-4d7df7e2b0ae","Type":"ContainerStarted","Data":"ecc426ca026213d75fa7ce8e203f1d757761689df2f77423575196b5830b58d7"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.145176 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" event={"ID":"0168b546-66c4-4663-a840-4b4b1968d725","Type":"ContainerStarted","Data":"bee047fd6fb08601bb2730f9f5e643a58a649005f345e3fd900c4838e59df7ee"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.146180 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469"] Feb 17 17:46:13 crc kubenswrapper[4892]: 
I0217 17:46:13.151285 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" event={"ID":"e7188959-438f-469b-8c0e-8e2af8c54d3f","Type":"ContainerStarted","Data":"953e94500ff9520d52dbdd25e914b4262bfa3b24f16ecd5a06d5bb0c34c59625"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.152256 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xcgd8" event={"ID":"7ec23c19-a8c4-4079-bd8d-897a91ad009f","Type":"ContainerStarted","Data":"055e4ca7b6906db405dafe54c5aa68ee5ea90a3fbd0b12a2c035444cb55f51e4"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.153685 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" event={"ID":"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a","Type":"ContainerStarted","Data":"2d0f382f8a5c33fad484510f1cb395d7cb3ed172562d68ca8d5d21b7a4283f53"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.167568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" event={"ID":"f6f8d659-22a7-478b-a52a-f1a82ee5a40a","Type":"ContainerStarted","Data":"baebf0f3f4b3ba5c96f2cf9b1a3e6570dc49e14b4672de4d84af53624417825e"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.168946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.169135 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:46:13.669118573 +0000 UTC m=+145.044521838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.169312 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.169696 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.669679937 +0000 UTC m=+145.045083202 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.170746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" event={"ID":"50dcb3e4-a35c-4e36-872c-32a252f542a7","Type":"ContainerStarted","Data":"dfc2f4da4a457fab14b6792f959f77dd8951c59342edacc77ff3da7a02218222"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.175166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" event={"ID":"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b","Type":"ContainerStarted","Data":"a13d10a152439e85b6afcb1a584bbd8ddf1c9f66b723cf9aaa5bf57b76d12c82"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.181232 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" event={"ID":"e1952c19-97f1-41ae-b9de-834d2325c943","Type":"ContainerStarted","Data":"437d365bc5035ba20ff78bb369f08abfd1b1b7e95a12e853019e07da51a1d8ad"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.185257 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" event={"ID":"72a2852b-7dd5-4681-b330-b8e4128d75b0","Type":"ContainerStarted","Data":"eeb3e46c2138be2a32a54a26cc4b91d2f796b6f8505b6e1499612bf6473c7dc4"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.216629 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" event={"ID":"b571ef16-161d-4280-aaa0-3f15b76dcc21","Type":"ContainerStarted","Data":"9efe8f1277fdc20debf03ee23134c925db9acf082eb9bbc03bcbc590de4d531c"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.220942 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" event={"ID":"2a08a25e-d2b9-4a1d-aeb6-e968436a9472","Type":"ContainerStarted","Data":"ba95f636ebbbc5efcf01663ed38484a2089aaafeebdafa7628314d8d836c9a09"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.230592 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" event={"ID":"58da6748-a277-48e6-a169-6e6477486e44","Type":"ContainerStarted","Data":"6b538bd5d8589bbca26cd1d57575037451e3ca26825fc1c6ce76fb8c708d61ac"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.251851 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" event={"ID":"f377a3b2-2464-4568-a5d9-0557bc175309","Type":"ContainerStarted","Data":"c5d8beb55e71d49eb30310b0638a0b706e9fec0b0a72be40a605a32055d272c6"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.255406 4892 generic.go:334] "Generic (PLEG): container finished" podID="14853a40-bee2-4d8e-a148-5f0ae761d71c" containerID="9d6ca017aae744d46178372de3b22cd2ebd00258b125613c19771e99610daeb6" exitCode=0 Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.255468 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" event={"ID":"14853a40-bee2-4d8e-a148-5f0ae761d71c","Type":"ContainerDied","Data":"9d6ca017aae744d46178372de3b22cd2ebd00258b125613c19771e99610daeb6"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.257741 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" event={"ID":"a81a9079-32b0-41a0-ae32-13fcc5c70c0b","Type":"ContainerStarted","Data":"cbee406b4f6b0cb110a33010483fc99bd172391fe29ff2ce5a56eb2f62c93417"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.259751 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" event={"ID":"183505a9-eeb4-4bf8-94ae-2c593e78b926","Type":"ContainerStarted","Data":"6581e50a4ad1f934152494bb5fbed617a4b12fd7ac9a226abae451e0cc4f0a7b"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.260229 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:13 crc kubenswrapper[4892]: W0217 17:46:13.261933 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8607d279_cebb_4590_a0c0_7dda3de5dfd5.slice/crio-7c86d33e019d96bfcc8d357515372271bb1c9aa2b9706682d6009eef82963fdd WatchSource:0}: Error finding container 7c86d33e019d96bfcc8d357515372271bb1c9aa2b9706682d6009eef82963fdd: Status 404 returned error can't find the container with id 7c86d33e019d96bfcc8d357515372271bb1c9aa2b9706682d6009eef82963fdd Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.262607 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" event={"ID":"ae17ea52-fb8a-4d96-9b27-41f6fb99de88","Type":"ContainerStarted","Data":"83f3279eea3ab4540e0e32eac0e622b78f5c9c942578bf6872ecfb1fc499a29b"} Feb 17 17:46:13 crc kubenswrapper[4892]: W0217 17:46:13.263162 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0572b1a8_4e2e_4d75_b1ab_34da859539cf.slice/crio-7023ecb5b28c56e55c4d5cdce4997435e992576fa0f5ccb0a05e6200b464d350 WatchSource:0}: Error finding container 
7023ecb5b28c56e55c4d5cdce4997435e992576fa0f5ccb0a05e6200b464d350: Status 404 returned error can't find the container with id 7023ecb5b28c56e55c4d5cdce4997435e992576fa0f5ccb0a05e6200b464d350 Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.264062 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" event={"ID":"fc4dd9a2-87aa-4405-a4cc-778c778aaec9","Type":"ContainerStarted","Data":"7a4dd4a3098a25aaa08eab3545192f2342640940c35e0699f2c31176c8a92602"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.268248 4892 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-st4m9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.268290 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.268667 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" event={"ID":"752ea4e8-6b0b-4fab-9e96-f21100f3e72d","Type":"ContainerStarted","Data":"87a7b041fe2fb3086cf2c139341434aa40ee30c47374c57d4010fa1d4c0f8cd4"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.270667 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.270796 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.770779154 +0000 UTC m=+145.146182419 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.270892 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.271224 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.771217315 +0000 UTC m=+145.146620580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.271949 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" event={"ID":"8df12f42-c275-445f-8c39-32d009a322d5","Type":"ContainerStarted","Data":"10525b0ef561ea57163c6b46bc073750cd46183e4407da6b82b42160aa68165d"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.275953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ddw22" event={"ID":"7c9c06c9-0472-42e8-906f-2c7d8ec5608d","Type":"ContainerStarted","Data":"24a5cb8e3f0f778d12a131d6173c313d1617375e8e328c809e6e2f9307d17a05"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.280028 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" event={"ID":"4baae3f3-ceff-445d-9cf0-82284224bff7","Type":"ContainerStarted","Data":"81b2d892fc75ff371816c3a647f8429d6358f47b2843f1d99914b835402fa303"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.280930 4892 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qm6w9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.280961 4892 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.283746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc25m" event={"ID":"0b0cbdbb-671e-41e1-b494-a369938dab8e","Type":"ContainerStarted","Data":"f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c"} Feb 17 17:46:13 crc kubenswrapper[4892]: W0217 17:46:13.285970 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda311419d_f51d_420d_aa7a_01cff548c963.slice/crio-dc19bdb646d84b45620ff960734082cd2f9e6e8ccde3085357c44fef73ba66f0 WatchSource:0}: Error finding container dc19bdb646d84b45620ff960734082cd2f9e6e8ccde3085357c44fef73ba66f0: Status 404 returned error can't find the container with id dc19bdb646d84b45620ff960734082cd2f9e6e8ccde3085357c44fef73ba66f0 Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.289351 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" event={"ID":"36e96f72-2b42-4b98-b6f1-5fe676510aed","Type":"ContainerStarted","Data":"05a14c572601a5a2abe7d39057b65eab779d0efdddc64d6c8827723e065d1911"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.316706 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" event={"ID":"99bb3092-4fdb-4533-98e7-45e62ac33a4a","Type":"ContainerStarted","Data":"9875400ae6f1f940864552fac10136ace6f1b426dcb94abee4c68ff85ace03f0"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.377548 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-4q9d4 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.377585 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4q9d4" podUID="843d01a5-8de5-4628-99d0-2ac552e9abf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.377681 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.377959 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.877940791 +0000 UTC m=+145.253344056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.380157 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.383677 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.883605943 +0000 UTC m=+145.259009208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.418133 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-t24t9" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.418171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" event={"ID":"fc208abe-d868-4d0e-bd8c-9f93e2e5c05a","Type":"ContainerStarted","Data":"673036ec564cb8143be605f9cf94d85d1f358957823669b1de54f887ca28ffe6"} Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.418189 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cv2jf"] Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.481304 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.483061 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:13.983043415 +0000 UTC m=+145.358446680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.506089 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wrdm2" podStartSLOduration=121.506069129 podStartE2EDuration="2m1.506069129s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.45926139 +0000 UTC m=+144.834664645" watchObservedRunningTime="2026-02-17 17:46:13.506069129 +0000 UTC m=+144.881472414" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.577262 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pm4rt" podStartSLOduration=121.577246857 podStartE2EDuration="2m1.577246857s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.574670399 +0000 UTC m=+144.950073654" watchObservedRunningTime="2026-02-17 17:46:13.577246857 +0000 UTC m=+144.952650122" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.590692 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.591037 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.091026124 +0000 UTC m=+145.466429389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.655554 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" podStartSLOduration=121.655536555 podStartE2EDuration="2m1.655536555s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.620993404 +0000 UTC m=+144.996396669" watchObservedRunningTime="2026-02-17 17:46:13.655536555 +0000 UTC m=+145.030939820" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.693361 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:13 crc 
kubenswrapper[4892]: E0217 17:46:13.693725 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.193710913 +0000 UTC m=+145.569114178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.796881 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.797245 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.297232835 +0000 UTC m=+145.672636100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.903010 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.904172 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.404151006 +0000 UTC m=+145.779554281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.904990 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:13 crc kubenswrapper[4892]: E0217 17:46:13.905320 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.405309627 +0000 UTC m=+145.780712892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.918299 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4q9d4" podStartSLOduration=121.918276663 podStartE2EDuration="2m1.918276663s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.861351935 +0000 UTC m=+145.236755210" watchObservedRunningTime="2026-02-17 17:46:13.918276663 +0000 UTC m=+145.293679928" Feb 17 17:46:13 crc kubenswrapper[4892]: I0217 17:46:13.974590 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lzbcm" podStartSLOduration=122.974573744 podStartE2EDuration="2m2.974573744s" podCreationTimestamp="2026-02-17 17:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.947260296 +0000 UTC m=+145.322663561" watchObservedRunningTime="2026-02-17 17:46:13.974573744 +0000 UTC m=+145.349977009" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.005669 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.006135 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.506119906 +0000 UTC m=+145.881523171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.017040 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" podStartSLOduration=122.017025016 podStartE2EDuration="2m2.017025016s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.015973988 +0000 UTC m=+145.391377263" watchObservedRunningTime="2026-02-17 17:46:14.017025016 +0000 UTC m=+145.392428281" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.017375 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zc25m" podStartSLOduration=122.017370656 podStartE2EDuration="2m2.017370656s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:13.974161223 +0000 UTC m=+145.349564508" 
watchObservedRunningTime="2026-02-17 17:46:14.017370656 +0000 UTC m=+145.392773911" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.101368 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-t24t9" podStartSLOduration=122.101349936 podStartE2EDuration="2m2.101349936s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.059127709 +0000 UTC m=+145.434530974" watchObservedRunningTime="2026-02-17 17:46:14.101349936 +0000 UTC m=+145.476753211" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.112393 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.112723 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.612707009 +0000 UTC m=+145.988110274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.213136 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.213457 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.713434245 +0000 UTC m=+146.088837510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.213546 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.214161 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.714148634 +0000 UTC m=+146.089551899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.254989 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.256607 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.271064 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.311320 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-km9qc" podStartSLOduration=122.311302166 podStartE2EDuration="2m2.311302166s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.134233402 +0000 UTC m=+145.509636667" watchObservedRunningTime="2026-02-17 17:46:14.311302166 +0000 UTC m=+145.686705431" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.314348 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.314795 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.814780638 +0000 UTC m=+146.190183903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.387752 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" event={"ID":"a81a9079-32b0-41a0-ae32-13fcc5c70c0b","Type":"ContainerStarted","Data":"5cdc6d284e7063f845ef867f9f7c4587249a22653bbd2462edadbbdaa8c6dd1a"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.387792 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" event={"ID":"a81a9079-32b0-41a0-ae32-13fcc5c70c0b","Type":"ContainerStarted","Data":"2094ab23ccfa3dc26d5be3d4d7b6c1e71a5b3718d9ddf4db01ce7e45e8f4f242"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.390154 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ddw22" event={"ID":"7c9c06c9-0472-42e8-906f-2c7d8ec5608d","Type":"ContainerStarted","Data":"648be0b6d5fe39ba517866f511b109b41388e3c6ffed3ed55fdd0580e7041051"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.391863 
4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" event={"ID":"58da6748-a277-48e6-a169-6e6477486e44","Type":"ContainerStarted","Data":"8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.392136 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.393674 4892 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-n8n76 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.393716 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" podUID="58da6748-a277-48e6-a169-6e6477486e44" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.395571 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" event={"ID":"0572b1a8-4e2e-4d75-b1ab-34da859539cf","Type":"ContainerStarted","Data":"1356c01c3a218a8376f54de87ef4d75682c63b16c40c77bfff267e105e7570b3"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.395614 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" event={"ID":"0572b1a8-4e2e-4d75-b1ab-34da859539cf","Type":"ContainerStarted","Data":"7023ecb5b28c56e55c4d5cdce4997435e992576fa0f5ccb0a05e6200b464d350"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 
17:46:14.403227 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" event={"ID":"f6f8d659-22a7-478b-a52a-f1a82ee5a40a","Type":"ContainerStarted","Data":"3035e68a94f2974bf4c9f4ebfb9345a43a8261989c8ea1e7484bc40fb3cd1ed3"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.406932 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" event={"ID":"a311419d-f51d-420d-aa7a-01cff548c963","Type":"ContainerStarted","Data":"1cfcf56bc39d53f7785333948ddb7d300de0bbfb60003d79f3f19fecb2aedde8"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.406973 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" event={"ID":"a311419d-f51d-420d-aa7a-01cff548c963","Type":"ContainerStarted","Data":"dc19bdb646d84b45620ff960734082cd2f9e6e8ccde3085357c44fef73ba66f0"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.414864 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2rzft" podStartSLOduration=122.414847546 podStartE2EDuration="2m2.414847546s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.413065399 +0000 UTC m=+145.788468664" watchObservedRunningTime="2026-02-17 17:46:14.414847546 +0000 UTC m=+145.790250811" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.415738 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: 
\"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.416086 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:14.916074099 +0000 UTC m=+146.291477364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.416871 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" event={"ID":"8e8f043a-de31-4c34-8ca8-d90a0ad0f125","Type":"ContainerStarted","Data":"5f6a652482b82aba3c82c90f7368fa8fff9b3c9e4a104db689df1ac7ba93ac3d"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.419354 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" event={"ID":"2a08a25e-d2b9-4a1d-aeb6-e968436a9472","Type":"ContainerStarted","Data":"1d1e7cc33930aa06b112fe35b4cf5b0d3d293cebdec134c0ba5bb75335c097ef"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.419381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" event={"ID":"2a08a25e-d2b9-4a1d-aeb6-e968436a9472","Type":"ContainerStarted","Data":"eb5a972339ccd2f073675cf0fcc8cbf4da895ac4f3955b0df6da484e1038c1b7"} Feb 17 17:46:14 crc kubenswrapper[4892]: 
I0217 17:46:14.419913 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.424991 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" event={"ID":"ae17ea52-fb8a-4d96-9b27-41f6fb99de88","Type":"ContainerStarted","Data":"757167cbaf9ad123a57838a442f57e4e093c645d7f8d88354242c01f11f43c86"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.426200 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" event={"ID":"14853a40-bee2-4d8e-a148-5f0ae761d71c","Type":"ContainerStarted","Data":"b17e4818095a39a979ccabcb018b00e4c57124becf46a09ec2803c052f8d1ce1"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.426707 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.429331 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" event={"ID":"72a2852b-7dd5-4681-b330-b8e4128d75b0","Type":"ContainerStarted","Data":"29f05574cc7dde59290b48d5e9a69123b70a22e9ff7ed4283d25d526df587b1f"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.432678 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" event={"ID":"8df12f42-c275-445f-8c39-32d009a322d5","Type":"ContainerStarted","Data":"cd726cd53b5f813dfdffff5ae0673e06f859dd0f43454b59d411395106341ce7"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.453924 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" 
podStartSLOduration=123.453907548 podStartE2EDuration="2m3.453907548s" podCreationTimestamp="2026-02-17 17:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.449783868 +0000 UTC m=+145.825187133" watchObservedRunningTime="2026-02-17 17:46:14.453907548 +0000 UTC m=+145.829310813" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.454221 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" event={"ID":"129ae682-89fa-4fa9-b4de-789ea8c0a9fd","Type":"ContainerStarted","Data":"9e6cdfc8e875bddb073a12dc86901f9f3706a0011929ad1922a1b78f9f6a1482"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.465825 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cv2jf" event={"ID":"1b62a756-057b-45a6-b568-b788dbb95fa0","Type":"ContainerStarted","Data":"179cd0596dca1a5795320869a87e50bf8c4ccd71fee7f70bff6d2d957421fc13"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.465866 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cv2jf" event={"ID":"1b62a756-057b-45a6-b568-b788dbb95fa0","Type":"ContainerStarted","Data":"d9ea48f31095d32958271718f8f9568aad0f22580ecda1b4b552041451b1e12c"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.471602 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" event={"ID":"1111f441-f7d1-4115-b288-48cef127137a","Type":"ContainerStarted","Data":"39338101665f015d00683b1772c24a547ba1bb7237f21bb2a360429436d44f3f"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.473876 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ddw22" podStartSLOduration=122.473861531 podStartE2EDuration="2m2.473861531s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.471544969 +0000 UTC m=+145.846948254" watchObservedRunningTime="2026-02-17 17:46:14.473861531 +0000 UTC m=+145.849264796" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.479920 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" event={"ID":"a94de5f7-987c-43ae-90e9-bc7a48fc4d6a","Type":"ContainerStarted","Data":"7d4da1f1e80a24cc707b41079c7f51f0c8d791fdfc97184668d0ba658756d87e"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.480665 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.485881 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" event={"ID":"fc4dd9a2-87aa-4405-a4cc-778c778aaec9","Type":"ContainerStarted","Data":"5a9d6f29dd69f0c88f04d654389d0c1fc3b2e8e6144ebe204be4e4bfe3f2b38f"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.486049 4892 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ph8l6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.486110 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" podUID="a94de5f7-987c-43ae-90e9-bc7a48fc4d6a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.487209 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.487500 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" event={"ID":"8607d279-cebb-4590-a0c0-7dda3de5dfd5","Type":"ContainerStarted","Data":"7c86d33e019d96bfcc8d357515372271bb1c9aa2b9706682d6009eef82963fdd"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.491369 4892 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xlhd5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.491404 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.494477 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" event={"ID":"752ea4e8-6b0b-4fab-9e96-f21100f3e72d","Type":"ContainerStarted","Data":"92b43d38c5106f2532536fc8d6c801d3bb0bc78de523fdf297143446adafab90"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.505868 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qj469" podStartSLOduration=122.505849054 podStartE2EDuration="2m2.505849054s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 17:46:14.498953269 +0000 UTC m=+145.874356534" watchObservedRunningTime="2026-02-17 17:46:14.505849054 +0000 UTC m=+145.881252319" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.508077 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-22jlr" event={"ID":"29cdf011-a502-4b68-932e-4d7df7e2b0ae","Type":"ContainerStarted","Data":"5cd55c534b93e77997f315899a51bf07864d3017d6f5ae698e5785290a19e7b8"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.513381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" event={"ID":"e7188959-438f-469b-8c0e-8e2af8c54d3f","Type":"ContainerStarted","Data":"c100493b1b044dece7cdee8d33b5aa01e8407da7b59d914eb5bc504670fa4694"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.514251 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.516459 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" event={"ID":"99bb3092-4fdb-4533-98e7-45e62ac33a4a","Type":"ContainerStarted","Data":"0938f42ccbdbfd0a610fca3859ab736c3ed60c392041a42399e333ff8e07b7c5"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.519969 4892 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9bbcq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.520018 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" podUID="e7188959-438f-469b-8c0e-8e2af8c54d3f" 
containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.520581 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znjqw" podStartSLOduration=122.520568296 podStartE2EDuration="2m2.520568296s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.520483344 +0000 UTC m=+145.895886609" watchObservedRunningTime="2026-02-17 17:46:14.520568296 +0000 UTC m=+145.895971561" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.520988 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.522983 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.02296852 +0000 UTC m=+146.398371785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.542219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" event={"ID":"e1952c19-97f1-41ae-b9de-834d2325c943","Type":"ContainerStarted","Data":"5499db8b463e9aec433b0f8d20b8cd61272a229994fd1219d2f8e8504b1e607f"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.543049 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.545484 4892 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-x6gtg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.545520 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" podUID="e1952c19-97f1-41ae-b9de-834d2325c943" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.563679 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vhwpb" 
podStartSLOduration=122.563414868 podStartE2EDuration="2m2.563414868s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.561490448 +0000 UTC m=+145.936893713" watchObservedRunningTime="2026-02-17 17:46:14.563414868 +0000 UTC m=+145.938818143" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.568064 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" event={"ID":"b571ef16-161d-4280-aaa0-3f15b76dcc21","Type":"ContainerStarted","Data":"aad8bccfdf0657040037b1458bd3e7feb65040f39d8f0108d4e497c6cda8943f"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.579590 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-22jlr" podStartSLOduration=6.57957199 podStartE2EDuration="6.57957199s" podCreationTimestamp="2026-02-17 17:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.579071246 +0000 UTC m=+145.954474511" watchObservedRunningTime="2026-02-17 17:46:14.57957199 +0000 UTC m=+145.954975255" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.581154 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xcgd8" event={"ID":"7ec23c19-a8c4-4079-bd8d-897a91ad009f","Type":"ContainerStarted","Data":"8d4fb6673cfd7c8e65d67c6ae06b35e6177c20b3b6384b4674277fd41caf3601"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.586995 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" event={"ID":"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b","Type":"ContainerStarted","Data":"e55ae70e70686203a3030c9f2c2ea1196a74a98f191199e18a337a89396a5cce"} Feb 17 
17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.587037 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" event={"ID":"d3b54c25-de44-4c2c-be7a-3e9d1ab88a2b","Type":"ContainerStarted","Data":"04a653ff3ff02ddbe5a636ee209980cb62005097c44a9e8689e8d7bb62bf803f"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.590446 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" event={"ID":"1af316b8-cbe5-47df-8e9b-8e5d839aee33","Type":"ContainerStarted","Data":"a1d2bb74a7315579aad46c05cd34dda2235157a1db045538846f0da13b397438"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.590499 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" event={"ID":"1af316b8-cbe5-47df-8e9b-8e5d839aee33","Type":"ContainerStarted","Data":"cc23eb6b2f4aaf880793c0c111e804ad004a8e4ff78cd0c61ca25fa4ba7d69a7"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.594263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" event={"ID":"0168b546-66c4-4663-a840-4b4b1968d725","Type":"ContainerStarted","Data":"152aa609a06d3d66623ea37d62926c7f9cbeb2a4be8e519e37ca3da6d5185b12"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.618125 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" podStartSLOduration=122.618106418 podStartE2EDuration="2m2.618106418s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.612736674 +0000 UTC m=+145.988139929" watchObservedRunningTime="2026-02-17 17:46:14.618106418 +0000 UTC m=+145.993509683" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.618434 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" event={"ID":"36e96f72-2b42-4b98-b6f1-5fe676510aed","Type":"ContainerStarted","Data":"a88d695e6f5a2ad69e46eb255ecaa22952273f6db7a85ce08ea79182ca8edbb7"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.622620 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.623355 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.123343397 +0000 UTC m=+146.498746652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.639170 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" event={"ID":"6c5aba9a-b953-479b-a16a-93af181b7445","Type":"ContainerStarted","Data":"73283e517f980357b520721b7842c0abd3af8480f65df66492417bfb2278f746"} Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.640018 4892 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-st4m9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.640081 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.647330 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" podStartSLOduration=122.647314966 podStartE2EDuration="2m2.647314966s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-17 17:46:14.644778559 +0000 UTC m=+146.020181824" watchObservedRunningTime="2026-02-17 17:46:14.647314966 +0000 UTC m=+146.022718231" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.657854 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.660502 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-t5flc" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.723711 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.725353 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.225333937 +0000 UTC m=+146.600737212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.826480 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.826852 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.326840545 +0000 UTC m=+146.702243810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.831456 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" podStartSLOduration=122.831438297 podStartE2EDuration="2m2.831438297s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.765190511 +0000 UTC m=+146.140593786" watchObservedRunningTime="2026-02-17 17:46:14.831438297 +0000 UTC m=+146.206841562" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.833235 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" podStartSLOduration=122.833226025 podStartE2EDuration="2m2.833226025s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.829990489 +0000 UTC m=+146.205393754" watchObservedRunningTime="2026-02-17 17:46:14.833226025 +0000 UTC m=+146.208629290" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.854055 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hzxbm" podStartSLOduration=122.8540354 podStartE2EDuration="2m2.8540354s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.851929853 +0000 UTC m=+146.227333128" watchObservedRunningTime="2026-02-17 17:46:14.8540354 +0000 UTC m=+146.229438665" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.866414 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" podStartSLOduration=122.86639898 podStartE2EDuration="2m2.86639898s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.865151996 +0000 UTC m=+146.240555271" watchObservedRunningTime="2026-02-17 17:46:14.86639898 +0000 UTC m=+146.241802245" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.888900 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lxn66" podStartSLOduration=123.88887945 podStartE2EDuration="2m3.88887945s" podCreationTimestamp="2026-02-17 17:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.880760233 +0000 UTC m=+146.256163488" watchObservedRunningTime="2026-02-17 17:46:14.88887945 +0000 UTC m=+146.264282715" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.923815 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" podStartSLOduration=122.92379472 podStartE2EDuration="2m2.92379472s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.923725158 +0000 UTC m=+146.299128423" watchObservedRunningTime="2026-02-17 
17:46:14.92379472 +0000 UTC m=+146.299197985" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.927724 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.927882 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.427861829 +0000 UTC m=+146.803265094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.928285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:14 crc kubenswrapper[4892]: E0217 17:46:14.928648 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 17:46:15.42863576 +0000 UTC m=+146.804039025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.954224 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kc9rz" podStartSLOduration=122.954187681 podStartE2EDuration="2m2.954187681s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.950716579 +0000 UTC m=+146.326119864" watchObservedRunningTime="2026-02-17 17:46:14.954187681 +0000 UTC m=+146.329590946" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.962760 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.967985 4892 patch_prober.go:28] interesting pod/router-default-5444994796-ddw22 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.968049 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ddw22" podUID="7c9c06c9-0472-42e8-906f-2c7d8ec5608d" containerName="router" probeResult="failure" output="Get 
\"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.973179 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" podStartSLOduration=122.973161387 podStartE2EDuration="2m2.973161387s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.970469406 +0000 UTC m=+146.345872671" watchObservedRunningTime="2026-02-17 17:46:14.973161387 +0000 UTC m=+146.348564652" Feb 17 17:46:14 crc kubenswrapper[4892]: I0217 17:46:14.992049 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" podStartSLOduration=74.99202971 podStartE2EDuration="1m14.99202971s" podCreationTimestamp="2026-02-17 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:14.991892656 +0000 UTC m=+146.367295931" watchObservedRunningTime="2026-02-17 17:46:14.99202971 +0000 UTC m=+146.367432975" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.011293 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-72rk9" podStartSLOduration=123.011260643 podStartE2EDuration="2m3.011260643s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.008345195 +0000 UTC m=+146.383748460" watchObservedRunningTime="2026-02-17 17:46:15.011260643 +0000 UTC m=+146.386663898" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.029855 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.030028 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.530002923 +0000 UTC m=+146.905406188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.030068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.030393 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.530381903 +0000 UTC m=+146.905785168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.063207 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k9jsf" podStartSLOduration=123.063188658 podStartE2EDuration="2m3.063188658s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.037960045 +0000 UTC m=+146.413363310" watchObservedRunningTime="2026-02-17 17:46:15.063188658 +0000 UTC m=+146.438591923" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.064914 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wqsp" podStartSLOduration=123.064909525 podStartE2EDuration="2m3.064909525s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.063860876 +0000 UTC m=+146.439264141" watchObservedRunningTime="2026-02-17 17:46:15.064909525 +0000 UTC m=+146.440312790" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.081938 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d9zlz" podStartSLOduration=123.081922878 podStartE2EDuration="2m3.081922878s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.081013393 +0000 UTC m=+146.456416658" watchObservedRunningTime="2026-02-17 17:46:15.081922878 +0000 UTC m=+146.457326143" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.109675 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5r5" podStartSLOduration=123.109660527 podStartE2EDuration="2m3.109660527s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.108779184 +0000 UTC m=+146.484182439" watchObservedRunningTime="2026-02-17 17:46:15.109660527 +0000 UTC m=+146.485063792" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.131187 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.131321 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.631294744 +0000 UTC m=+147.006697999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.131437 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.131908 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.63189382 +0000 UTC m=+147.007297095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.143003 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bgzdj" podStartSLOduration=123.142980816 podStartE2EDuration="2m3.142980816s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.142923354 +0000 UTC m=+146.518326619" watchObservedRunningTime="2026-02-17 17:46:15.142980816 +0000 UTC m=+146.518384081" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.218597 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-v4nm6" podStartSLOduration=123.218574992 podStartE2EDuration="2m3.218574992s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.211769351 +0000 UTC m=+146.587172626" watchObservedRunningTime="2026-02-17 17:46:15.218574992 +0000 UTC m=+146.593978257" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.232573 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.232743 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.732720069 +0000 UTC m=+147.108123334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.232918 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.233267 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.733253194 +0000 UTC m=+147.108656459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.334339 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.334508 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.834481014 +0000 UTC m=+147.209884269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.334922 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.335210 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.835202763 +0000 UTC m=+147.210606028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.352398 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xcgd8" podStartSLOduration=7.352378661 podStartE2EDuration="7.352378661s" podCreationTimestamp="2026-02-17 17:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.351361864 +0000 UTC m=+146.726765129" watchObservedRunningTime="2026-02-17 17:46:15.352378661 +0000 UTC m=+146.727781936" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.436303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.436650 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:15.936634498 +0000 UTC m=+147.312037763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.537610 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.538023 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.038008481 +0000 UTC m=+147.413411746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.638931 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.639138 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.139110818 +0000 UTC m=+147.514514083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.639189 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.639455 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.139446168 +0000 UTC m=+147.514849433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.649284 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" event={"ID":"ae17ea52-fb8a-4d96-9b27-41f6fb99de88","Type":"ContainerStarted","Data":"3ff93750595994880e73b02483f198beb1b94771e0c06aedb9f6e95f0758ae1b"} Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.650916 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" event={"ID":"8607d279-cebb-4590-a0c0-7dda3de5dfd5","Type":"ContainerStarted","Data":"d6fc31a49617f697bea5f4639310804a47ffad14e003132c8a501bb324196be0"} Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.653675 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cv2jf" event={"ID":"1b62a756-057b-45a6-b568-b788dbb95fa0","Type":"ContainerStarted","Data":"1173bec1db12451435e06d7f0d1589831051ed86e12da6bee3804974e241456a"} Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.653708 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.658991 4892 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xlhd5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 17 17:46:15 crc kubenswrapper[4892]: 
I0217 17:46:15.659871 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.660950 4892 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ph8l6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.661003 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" podUID="a94de5f7-987c-43ae-90e9-bc7a48fc4d6a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.673165 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x6gtg" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.717757 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwms" podStartSLOduration=123.717741796 podStartE2EDuration="2m3.717741796s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.440518172 +0000 UTC m=+146.815921437" watchObservedRunningTime="2026-02-17 17:46:15.717741796 +0000 UTC m=+147.093145061" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.740410 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.741737 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.241718655 +0000 UTC m=+147.617121920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.789950 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.790139 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.803152 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cv2jf" podStartSLOduration=7.803133513 podStartE2EDuration="7.803133513s" podCreationTimestamp="2026-02-17 17:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
17:46:15.797115692 +0000 UTC m=+147.172518957" watchObservedRunningTime="2026-02-17 17:46:15.803133513 +0000 UTC m=+147.178536778" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.803572 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vm7qp" podStartSLOduration=124.803568174 podStartE2EDuration="2m4.803568174s" podCreationTimestamp="2026-02-17 17:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:15.721869615 +0000 UTC m=+147.097272880" watchObservedRunningTime="2026-02-17 17:46:15.803568174 +0000 UTC m=+147.178971439" Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.843681 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.844073 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.344057425 +0000 UTC m=+147.719460690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.944805 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.944945 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.444920645 +0000 UTC m=+147.820323910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.945062 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:15 crc kubenswrapper[4892]: E0217 17:46:15.945364 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.445351256 +0000 UTC m=+147.820754521 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.967320 4892 patch_prober.go:28] interesting pod/router-default-5444994796-ddw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 17:46:15 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Feb 17 17:46:15 crc kubenswrapper[4892]: [+]process-running ok Feb 17 17:46:15 crc kubenswrapper[4892]: healthz check failed Feb 17 17:46:15 crc kubenswrapper[4892]: I0217 17:46:15.967384 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ddw22" podUID="7c9c06c9-0472-42e8-906f-2c7d8ec5608d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.046114 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.046251 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 17:46:16.546227256 +0000 UTC m=+147.921630521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.046379 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.046681 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.546672788 +0000 UTC m=+147.922076053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.147323 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.147515 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.647498857 +0000 UTC m=+148.022902122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.147571 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.147946 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.647931749 +0000 UTC m=+148.023335014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.248560 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.249204 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.749177479 +0000 UTC m=+148.124580734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.350069 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.350439 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.850422759 +0000 UTC m=+148.225826024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.409383 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.450988 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.451165 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.951139555 +0000 UTC m=+148.326542820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.451426 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.451708 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:16.951699971 +0000 UTC m=+148.327103236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.552138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.552318 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.052279052 +0000 UTC m=+148.427682317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.552461 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.552762 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.052751566 +0000 UTC m=+148.428154831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.568999 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bbcq" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.653264 4892 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-n8n76 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.653420 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" podUID="58da6748-a277-48e6-a169-6e6477486e44" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.653598 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.653715 4892 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.153698728 +0000 UTC m=+148.529101993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.653762 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.654119 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.154112609 +0000 UTC m=+148.529515874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.681659 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" event={"ID":"8607d279-cebb-4590-a0c0-7dda3de5dfd5","Type":"ContainerStarted","Data":"566e5ed7ee66b6e390a55ce01a8b2fb3926f758fcd805f9af4bb2f794b824733"} Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.694184 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dc6pg" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.698196 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.699945 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ph8l6" Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.754392 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.754658 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.25463139 +0000 UTC m=+148.630034655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.754867 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.755248 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.255231756 +0000 UTC m=+148.630635011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.859353 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.859547 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.359519627 +0000 UTC m=+148.734922892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.859778 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.860395 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.36038193 +0000 UTC m=+148.735785195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.895432 4892 csr.go:261] certificate signing request csr-552g8 is approved, waiting to be issued Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.919327 4892 csr.go:257] certificate signing request csr-552g8 is issued Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.961297 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:16 crc kubenswrapper[4892]: E0217 17:46:16.961675 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.461660072 +0000 UTC m=+148.837063337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:16 crc kubenswrapper[4892]: I0217 17:46:16.969376 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.062911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.063225 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.56320922 +0000 UTC m=+148.938612485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.142438 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vh5f8"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.143418 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.150143 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.163925 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.164071 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.664048989 +0000 UTC m=+149.039452254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.164218 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.164597 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.664586073 +0000 UTC m=+149.039989338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.176755 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vh5f8"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.265811 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.265952 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.266016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.266037 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.266073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-catalog-content\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.266096 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-utilities\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.266116 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vd6\" (UniqueName: \"kubernetes.io/projected/cbb9e745-9259-437f-a600-80153e687c65-kube-api-access-k8vd6\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.266134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.266600 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.766575493 +0000 UTC m=+149.141978758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.270073 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.275521 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.276048 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.302356 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.324268 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2dtr9"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.325477 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.329629 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.353220 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dtr9"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.366766 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-catalog-content\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.366807 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-utilities\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.366845 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vd6\" (UniqueName: \"kubernetes.io/projected/cbb9e745-9259-437f-a600-80153e687c65-kube-api-access-k8vd6\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.366875 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.367154 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.867142885 +0000 UTC m=+149.242546150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.367334 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-catalog-content\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.367521 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-utilities\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.381229 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.388446 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.396083 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.412951 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vd6\" (UniqueName: \"kubernetes.io/projected/cbb9e745-9259-437f-a600-80153e687c65-kube-api-access-k8vd6\") pod \"certified-operators-vh5f8\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.468477 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.468752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-catalog-content\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.468789 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-utilities\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.468873 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbkk\" (UniqueName: \"kubernetes.io/projected/9c4922f3-1e12-469d-9afc-c2c52238e551-kube-api-access-8fbkk\") pod 
\"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.469024 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 17:46:17.969008903 +0000 UTC m=+149.344412168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.478123 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.510775 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzbvh"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.539656 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzbvh"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.539777 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.540905 4892 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.570103 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbkk\" (UniqueName: \"kubernetes.io/projected/9c4922f3-1e12-469d-9afc-c2c52238e551-kube-api-access-8fbkk\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.570148 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.570193 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-catalog-content\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.570209 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-utilities\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc 
kubenswrapper[4892]: I0217 17:46:17.570572 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-utilities\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: E0217 17:46:17.571037 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 17:46:18.071026353 +0000 UTC m=+149.446429618 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cnjbv" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.571217 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-catalog-content\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.586685 4892 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T17:46:17.540985112Z","Handler":null,"Name":""} Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.597007 4892 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI 
Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.597055 4892 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.606204 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fbkk\" (UniqueName: \"kubernetes.io/projected/9c4922f3-1e12-469d-9afc-c2c52238e551-kube-api-access-8fbkk\") pod \"community-operators-2dtr9\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.641785 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.673157 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.673335 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8tfn\" (UniqueName: \"kubernetes.io/projected/cda40965-c69d-469d-beb4-91582508ad77-kube-api-access-m8tfn\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.673386 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-catalog-content\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.673404 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-utilities\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.706003 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.727753 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.728550 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.736033 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9n77k"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.736717 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.736876 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.736994 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.759327 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.759370 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n77k"] Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.776628 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8tfn\" (UniqueName: \"kubernetes.io/projected/cda40965-c69d-469d-beb4-91582508ad77-kube-api-access-m8tfn\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.776722 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-catalog-content\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 
17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.776753 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-utilities\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.776831 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.779908 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-catalog-content\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.780323 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-utilities\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.807495 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" event={"ID":"8607d279-cebb-4590-a0c0-7dda3de5dfd5","Type":"ContainerStarted","Data":"a80ae4077b842e27a0239240f9aaf439c54dfd2aa618dd41e7981e92af591e38"} Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.807531 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" event={"ID":"8607d279-cebb-4590-a0c0-7dda3de5dfd5","Type":"ContainerStarted","Data":"036012bae9ac468e17707d0e6c023af1b07bd288999c2895e1c704d7620cd1e6"} Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.809496 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.813067 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ddw22" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.814897 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.814936 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.840685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8tfn\" (UniqueName: \"kubernetes.io/projected/cda40965-c69d-469d-beb4-91582508ad77-kube-api-access-m8tfn\") pod \"certified-operators-dzbvh\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.891387 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-catalog-content\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.891439 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ccb84da-913b-4bdc-8c60-e88713b63b81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.891482 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ccb84da-913b-4bdc-8c60-e88713b63b81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.891497 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bppw6\" (UniqueName: \"kubernetes.io/projected/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-kube-api-access-bppw6\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.891539 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-utilities\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.918091 
4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.922140 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 17:41:16 +0000 UTC, rotation deadline is 2026-11-17 18:57:52.805083915 +0000 UTC Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.922168 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6553h11m34.882918387s for next certificate rotation Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.992476 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ccb84da-913b-4bdc-8c60-e88713b63b81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.992767 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bppw6\" (UniqueName: \"kubernetes.io/projected/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-kube-api-access-bppw6\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.992920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-utilities\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.992969 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-catalog-content\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.993025 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ccb84da-913b-4bdc-8c60-e88713b63b81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.994565 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ccb84da-913b-4bdc-8c60-e88713b63b81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.994935 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-utilities\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:17 crc kubenswrapper[4892]: I0217 17:46:17.995522 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-catalog-content\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.001782 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cnjbv\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") " pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.014373 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ccb84da-913b-4bdc-8c60-e88713b63b81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.016595 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bppw6\" (UniqueName: \"kubernetes.io/projected/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-kube-api-access-bppw6\") pod \"community-operators-9n77k\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.107279 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.119024 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" podStartSLOduration=10.119008189 podStartE2EDuration="10.119008189s" podCreationTimestamp="2026-02-17 17:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:17.901097807 +0000 UTC m=+149.276501072" watchObservedRunningTime="2026-02-17 17:46:18.119008189 +0000 UTC m=+149.494411454" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.128069 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.255017 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:18 crc kubenswrapper[4892]: W0217 17:46:18.319961 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-55a0779ffa700cd0107fde0add27a93cde3fb8c0f153c03decfa493707b6bd14 WatchSource:0}: Error finding container 55a0779ffa700cd0107fde0add27a93cde3fb8c0f153c03decfa493707b6bd14: Status 404 returned error can't find the container with id 55a0779ffa700cd0107fde0add27a93cde3fb8c0f153c03decfa493707b6bd14 Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.454114 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vh5f8"] Feb 17 17:46:18 crc kubenswrapper[4892]: W0217 17:46:18.489526 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb9e745_9259_437f_a600_80153e687c65.slice/crio-268dbcdc6d9560e491f03e837ec0d4f078e92e5283437cb1689f23c81f22ac04 WatchSource:0}: Error finding container 268dbcdc6d9560e491f03e837ec0d4f078e92e5283437cb1689f23c81f22ac04: Status 404 returned error can't find the container with id 268dbcdc6d9560e491f03e837ec0d4f078e92e5283437cb1689f23c81f22ac04 Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.646472 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n77k"] Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.691901 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 17:46:18 crc kubenswrapper[4892]: W0217 17:46:18.700495 4892 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-pod9ccb84da_913b_4bdc_8c60_e88713b63b81.slice/crio-ad9fbacf4f638a1ff48629f3382c8406ce0c7d12e719f6e3c2efe0d24377dba1 WatchSource:0}: Error finding container ad9fbacf4f638a1ff48629f3382c8406ce0c7d12e719f6e3c2efe0d24377dba1: Status 404 returned error can't find the container with id ad9fbacf4f638a1ff48629f3382c8406ce0c7d12e719f6e3c2efe0d24377dba1 Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.730832 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzbvh"] Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.734669 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dtr9"] Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.856212 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4d55de19f05ea7b75307bcd2314b6e0834421f19a77a058433897e58a6082704"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.856541 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"41d17a37d7e011827194fabad291b209d0a8ca050b2f7e8c2bafb4cb13f1af62"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.863434 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n77k" event={"ID":"0c8cf662-ff88-481f-ae9d-de0fe49af1f4","Type":"ContainerStarted","Data":"10890d3fc0ec04ba91cccf530482cad3cd1d64f2284110d109b6cb4c2f9a7ab1"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.868734 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dtr9" 
event={"ID":"9c4922f3-1e12-469d-9afc-c2c52238e551","Type":"ContainerStarted","Data":"b8f0ac963a1f02ca7520e0d01648f9fb03a9ba179d2acf8ea629ed075194fd9b"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.875411 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzbvh" event={"ID":"cda40965-c69d-469d-beb4-91582508ad77","Type":"ContainerStarted","Data":"41e16e977ddf1804ea1891f5b9f306921e481f4ea0ef64193203838d6fe7f04c"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.900221 4892 generic.go:334] "Generic (PLEG): container finished" podID="1111f441-f7d1-4115-b288-48cef127137a" containerID="39338101665f015d00683b1772c24a547ba1bb7237f21bb2a360429436d44f3f" exitCode=0 Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.900313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" event={"ID":"1111f441-f7d1-4115-b288-48cef127137a","Type":"ContainerDied","Data":"39338101665f015d00683b1772c24a547ba1bb7237f21bb2a360429436d44f3f"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.914684 4892 generic.go:334] "Generic (PLEG): container finished" podID="cbb9e745-9259-437f-a600-80153e687c65" containerID="8ebeceef20e179b1f52af3ee7ec578218e9f83787d6135ccf4597c9c5326e232" exitCode=0 Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.914844 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh5f8" event={"ID":"cbb9e745-9259-437f-a600-80153e687c65","Type":"ContainerDied","Data":"8ebeceef20e179b1f52af3ee7ec578218e9f83787d6135ccf4597c9c5326e232"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.914874 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh5f8" event={"ID":"cbb9e745-9259-437f-a600-80153e687c65","Type":"ContainerStarted","Data":"268dbcdc6d9560e491f03e837ec0d4f078e92e5283437cb1689f23c81f22ac04"} Feb 17 17:46:18 crc 
kubenswrapper[4892]: I0217 17:46:18.917343 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.920553 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c6433daf61e1780161793bac060b622d447983b6538ef6af74f8be9fba2e2cee"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.920584 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cc718891f5fc2e08d5743c76ee9fc5e9c5d798ec035914db8cf9acf7443db526"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.921097 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.937469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4c7f59262a91dba05768a6f7760f66bf2667899a2a55b117d27dcdffcdf4efdf"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.937524 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"55a0779ffa700cd0107fde0add27a93cde3fb8c0f153c03decfa493707b6bd14"} Feb 17 17:46:18 crc kubenswrapper[4892]: I0217 17:46:18.955530 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"9ccb84da-913b-4bdc-8c60-e88713b63b81","Type":"ContainerStarted","Data":"ad9fbacf4f638a1ff48629f3382c8406ce0c7d12e719f6e3c2efe0d24377dba1"} Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.051545 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cnjbv"] Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.112146 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjqrw"] Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.113224 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.117866 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.124084 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjqrw"] Feb 17 17:46:19 crc kubenswrapper[4892]: W0217 17:46:19.152618 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3fc996d_107b_4647_b52f_54fef31f9059.slice/crio-46de291552e7a1225c0a9f84848364c156ad88e5930d6cc7e79e21c197073f82 WatchSource:0}: Error finding container 46de291552e7a1225c0a9f84848364c156ad88e5930d6cc7e79e21c197073f82: Status 404 returned error can't find the container with id 46de291552e7a1225c0a9f84848364c156ad88e5930d6cc7e79e21c197073f82 Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.233201 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-catalog-content\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " 
pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.233456 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-utilities\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.233533 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqtf\" (UniqueName: \"kubernetes.io/projected/8498dfc3-1aa0-4059-abad-cab139ba83ec-kube-api-access-lqqtf\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.334400 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-catalog-content\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.334456 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-utilities\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.334542 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqtf\" (UniqueName: \"kubernetes.io/projected/8498dfc3-1aa0-4059-abad-cab139ba83ec-kube-api-access-lqqtf\") pod \"redhat-marketplace-rjqrw\" (UID: 
\"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.334850 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-catalog-content\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.335024 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-utilities\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.358544 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqtf\" (UniqueName: \"kubernetes.io/projected/8498dfc3-1aa0-4059-abad-cab139ba83ec-kube-api-access-lqqtf\") pod \"redhat-marketplace-rjqrw\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.365923 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.461462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.524725 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q2t7d"] Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.525936 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.538987 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2t7d"] Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.642797 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9j7z\" (UniqueName: \"kubernetes.io/projected/0a738868-4aa5-4aa8-8058-863b3dfb3acb-kube-api-access-z9j7z\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.643131 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-catalog-content\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.643166 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-utilities\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.743649 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9j7z\" (UniqueName: \"kubernetes.io/projected/0a738868-4aa5-4aa8-8058-863b3dfb3acb-kube-api-access-z9j7z\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.743732 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-catalog-content\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.743757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-utilities\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.744223 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-utilities\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.744283 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-catalog-content\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.750642 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjqrw"] Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.760558 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9j7z\" (UniqueName: \"kubernetes.io/projected/0a738868-4aa5-4aa8-8058-863b3dfb3acb-kube-api-access-z9j7z\") pod \"redhat-marketplace-q2t7d\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " 
pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.868540 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.969246 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" event={"ID":"b3fc996d-107b-4647-b52f-54fef31f9059","Type":"ContainerStarted","Data":"f7b748acabc068e57b10cd73410ba1adaa2b620c1854c8e6c1a80454fbd7f693"} Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.969610 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" event={"ID":"b3fc996d-107b-4647-b52f-54fef31f9059","Type":"ContainerStarted","Data":"46de291552e7a1225c0a9f84848364c156ad88e5930d6cc7e79e21c197073f82"} Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.970440 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.974575 4892 generic.go:334] "Generic (PLEG): container finished" podID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerID="a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9" exitCode=0 Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.974948 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n77k" event={"ID":"0c8cf662-ff88-481f-ae9d-de0fe49af1f4","Type":"ContainerDied","Data":"a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9"} Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.979775 4892 generic.go:334] "Generic (PLEG): container finished" podID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerID="5af8f01b467aa9c8a4acc40a1cfdc98f6a250c45f7553a2d4a3df95fbfcf8e79" exitCode=0 Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 
17:46:19.979848 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dtr9" event={"ID":"9c4922f3-1e12-469d-9afc-c2c52238e551","Type":"ContainerDied","Data":"5af8f01b467aa9c8a4acc40a1cfdc98f6a250c45f7553a2d4a3df95fbfcf8e79"} Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.982493 4892 generic.go:334] "Generic (PLEG): container finished" podID="cda40965-c69d-469d-beb4-91582508ad77" containerID="95e2423cb23413ef2cd8d48d47391cfcdb331c4ca8874388e26ba6148dd5ca0d" exitCode=0 Feb 17 17:46:19 crc kubenswrapper[4892]: I0217 17:46:19.982546 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzbvh" event={"ID":"cda40965-c69d-469d-beb4-91582508ad77","Type":"ContainerDied","Data":"95e2423cb23413ef2cd8d48d47391cfcdb331c4ca8874388e26ba6148dd5ca0d"} Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:19.998677 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" podStartSLOduration=127.99866349 podStartE2EDuration="2m7.99866349s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:19.998213338 +0000 UTC m=+151.373616613" watchObservedRunningTime="2026-02-17 17:46:19.99866349 +0000 UTC m=+151.374066755" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.012087 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjqrw" event={"ID":"8498dfc3-1aa0-4059-abad-cab139ba83ec","Type":"ContainerStarted","Data":"bd524d390e8e71c79a3d7b80c04bf32a57cc9ab1c9849e4de05f419819cb9341"} Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.013780 4892 generic.go:334] "Generic (PLEG): container finished" podID="9ccb84da-913b-4bdc-8c60-e88713b63b81" 
containerID="589a4b91c3b9839906ac1fcb7d62a2181980267ddc549c7652c7d96595b914d9" exitCode=0 Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.014117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9ccb84da-913b-4bdc-8c60-e88713b63b81","Type":"ContainerDied","Data":"589a4b91c3b9839906ac1fcb7d62a2181980267ddc549c7652c7d96595b914d9"} Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.140015 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2t7d"] Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.312785 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.460246 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svpj\" (UniqueName: \"kubernetes.io/projected/1111f441-f7d1-4115-b288-48cef127137a-kube-api-access-4svpj\") pod \"1111f441-f7d1-4115-b288-48cef127137a\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.460303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1111f441-f7d1-4115-b288-48cef127137a-config-volume\") pod \"1111f441-f7d1-4115-b288-48cef127137a\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.460374 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1111f441-f7d1-4115-b288-48cef127137a-secret-volume\") pod \"1111f441-f7d1-4115-b288-48cef127137a\" (UID: \"1111f441-f7d1-4115-b288-48cef127137a\") " Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.461419 4892 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/1111f441-f7d1-4115-b288-48cef127137a-config-volume" (OuterVolumeSpecName: "config-volume") pod "1111f441-f7d1-4115-b288-48cef127137a" (UID: "1111f441-f7d1-4115-b288-48cef127137a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.466588 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1111f441-f7d1-4115-b288-48cef127137a-kube-api-access-4svpj" (OuterVolumeSpecName: "kube-api-access-4svpj") pod "1111f441-f7d1-4115-b288-48cef127137a" (UID: "1111f441-f7d1-4115-b288-48cef127137a"). InnerVolumeSpecName "kube-api-access-4svpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.480945 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1111f441-f7d1-4115-b288-48cef127137a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1111f441-f7d1-4115-b288-48cef127137a" (UID: "1111f441-f7d1-4115-b288-48cef127137a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.511753 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q424n"] Feb 17 17:46:20 crc kubenswrapper[4892]: E0217 17:46:20.513138 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1111f441-f7d1-4115-b288-48cef127137a" containerName="collect-profiles" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.513213 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1111f441-f7d1-4115-b288-48cef127137a" containerName="collect-profiles" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.513413 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1111f441-f7d1-4115-b288-48cef127137a" containerName="collect-profiles" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.514224 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.518253 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.525587 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q424n"] Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.562179 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1111f441-f7d1-4115-b288-48cef127137a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.562204 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svpj\" (UniqueName: \"kubernetes.io/projected/1111f441-f7d1-4115-b288-48cef127137a-kube-api-access-4svpj\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.562213 4892 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1111f441-f7d1-4115-b288-48cef127137a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.664750 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2t9n\" (UniqueName: \"kubernetes.io/projected/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-kube-api-access-t2t9n\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.664872 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-utilities\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.664987 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-catalog-content\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.767015 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2t9n\" (UniqueName: \"kubernetes.io/projected/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-kube-api-access-t2t9n\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.768131 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-catalog-content\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.768224 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-utilities\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.768707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-utilities\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.769277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-catalog-content\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.789980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2t9n\" (UniqueName: \"kubernetes.io/projected/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-kube-api-access-t2t9n\") pod \"redhat-operators-q424n\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.816551 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:20 crc 
kubenswrapper[4892]: I0217 17:46:20.823349 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-4q9d4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.823416 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4q9d4" podUID="843d01a5-8de5-4628-99d0-2ac552e9abf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.823631 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-4q9d4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.823681 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4q9d4" podUID="843d01a5-8de5-4628-99d0-2ac552e9abf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.861450 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.919934 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f7rxl"] Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.921372 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.921926 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.921978 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.926974 4892 patch_prober.go:28] interesting pod/console-f9d7485db-zc25m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.927028 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zc25m" podUID="0b0cbdbb-671e-41e1-b494-a369938dab8e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 17 17:46:20 crc kubenswrapper[4892]: I0217 17:46:20.930459 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7rxl"] Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.054187 4892 generic.go:334] "Generic (PLEG): container finished" podID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerID="df1a5f5e0cb626905f67b812abc37e81b58844c9a8912a578f943334dbb74769" exitCode=0 Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.054295 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjqrw" event={"ID":"8498dfc3-1aa0-4059-abad-cab139ba83ec","Type":"ContainerDied","Data":"df1a5f5e0cb626905f67b812abc37e81b58844c9a8912a578f943334dbb74769"} Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.058805 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerID="2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068" exitCode=0 Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.058934 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2t7d" event={"ID":"0a738868-4aa5-4aa8-8058-863b3dfb3acb","Type":"ContainerDied","Data":"2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068"} Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.058986 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2t7d" event={"ID":"0a738868-4aa5-4aa8-8058-863b3dfb3acb","Type":"ContainerStarted","Data":"e70d8ae6109d36f340662d54b77f01f06be0d761e67f0686475aeeeeb7b895e6"} Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.065605 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.070365 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6" event={"ID":"1111f441-f7d1-4115-b288-48cef127137a","Type":"ContainerDied","Data":"2e380bd5921e06b4e83a586224baf0fd4201a38c1057f932dffc2ad51d7100c7"} Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.070415 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e380bd5921e06b4e83a586224baf0fd4201a38c1057f932dffc2ad51d7100c7" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.075499 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-catalog-content\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc 
kubenswrapper[4892]: I0217 17:46:21.075564 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj6nr\" (UniqueName: \"kubernetes.io/projected/6a4ab50e-1f68-4755-a980-157e38a83436-kube-api-access-jj6nr\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.075614 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-utilities\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.177199 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-catalog-content\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.177285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj6nr\" (UniqueName: \"kubernetes.io/projected/6a4ab50e-1f68-4755-a980-157e38a83436-kube-api-access-jj6nr\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.177369 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-utilities\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc 
kubenswrapper[4892]: I0217 17:46:21.177892 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-utilities\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.181677 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-catalog-content\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.202293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj6nr\" (UniqueName: \"kubernetes.io/projected/6a4ab50e-1f68-4755-a980-157e38a83436-kube-api-access-jj6nr\") pod \"redhat-operators-f7rxl\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.274621 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.315882 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.325576 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.385946 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q424n"] Feb 17 17:46:21 crc kubenswrapper[4892]: W0217 17:46:21.391602 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c2cfe7_cd0c_4fc8_8913_2bf0089a21e1.slice/crio-ea78c26ba1cc84d62e5174b25361e3f3ad3514ed99cb8d07529ab77159409551 WatchSource:0}: Error finding container ea78c26ba1cc84d62e5174b25361e3f3ad3514ed99cb8d07529ab77159409551: Status 404 returned error can't find the container with id ea78c26ba1cc84d62e5174b25361e3f3ad3514ed99cb8d07529ab77159409551 Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.480907 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ccb84da-913b-4bdc-8c60-e88713b63b81-kubelet-dir\") pod \"9ccb84da-913b-4bdc-8c60-e88713b63b81\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.481024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ccb84da-913b-4bdc-8c60-e88713b63b81-kube-api-access\") pod \"9ccb84da-913b-4bdc-8c60-e88713b63b81\" (UID: \"9ccb84da-913b-4bdc-8c60-e88713b63b81\") " Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.482017 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb84da-913b-4bdc-8c60-e88713b63b81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9ccb84da-913b-4bdc-8c60-e88713b63b81" (UID: "9ccb84da-913b-4bdc-8c60-e88713b63b81"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.487031 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccb84da-913b-4bdc-8c60-e88713b63b81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9ccb84da-913b-4bdc-8c60-e88713b63b81" (UID: "9ccb84da-913b-4bdc-8c60-e88713b63b81"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.587357 4892 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ccb84da-913b-4bdc-8c60-e88713b63b81-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.587595 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ccb84da-913b-4bdc-8c60-e88713b63b81-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.588065 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 17:46:21 crc kubenswrapper[4892]: E0217 17:46:21.588270 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb84da-913b-4bdc-8c60-e88713b63b81" containerName="pruner" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.588281 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb84da-913b-4bdc-8c60-e88713b63b81" containerName="pruner" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.588365 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb84da-913b-4bdc-8c60-e88713b63b81" containerName="pruner" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.588701 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.591278 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.591525 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.597598 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.600188 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7rxl"] Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.688849 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.688924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.789999 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 
17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.790087 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.790280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.811178 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:21 crc kubenswrapper[4892]: I0217 17:46:21.919613 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.022394 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.075972 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.076498 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9ccb84da-913b-4bdc-8c60-e88713b63b81","Type":"ContainerDied","Data":"ad9fbacf4f638a1ff48629f3382c8406ce0c7d12e719f6e3c2efe0d24377dba1"} Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.076528 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9fbacf4f638a1ff48629f3382c8406ce0c7d12e719f6e3c2efe0d24377dba1" Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.077989 4892 generic.go:334] "Generic (PLEG): container finished" podID="6a4ab50e-1f68-4755-a980-157e38a83436" containerID="d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd" exitCode=0 Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.078056 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerDied","Data":"d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd"} Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.078082 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerStarted","Data":"614787ceadd9869372533cd81172060e31e3e3ecb3dab6b1a24b0a60d41213ed"} Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.082713 4892 generic.go:334] "Generic (PLEG): container finished" podID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerID="4fa1642e7d2dba1351d3ab8f9cc9e029523c0344237a6ae1ce6d314534967ba7" exitCode=0 Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.082788 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" 
event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerDied","Data":"4fa1642e7d2dba1351d3ab8f9cc9e029523c0344237a6ae1ce6d314534967ba7"} Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.082824 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerStarted","Data":"ea78c26ba1cc84d62e5174b25361e3f3ad3514ed99cb8d07529ab77159409551"} Feb 17 17:46:22 crc kubenswrapper[4892]: I0217 17:46:22.348336 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 17:46:22 crc kubenswrapper[4892]: W0217 17:46:22.367744 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod43bf66f0_1781_4f7f_9953_db63cd55ff3b.slice/crio-c7111de8a6262c98aff8ba630fad308e863cf1062a58dae1eb4b695e9e9ced34 WatchSource:0}: Error finding container c7111de8a6262c98aff8ba630fad308e863cf1062a58dae1eb4b695e9e9ced34: Status 404 returned error can't find the container with id c7111de8a6262c98aff8ba630fad308e863cf1062a58dae1eb4b695e9e9ced34 Feb 17 17:46:23 crc kubenswrapper[4892]: I0217 17:46:23.094045 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43bf66f0-1781-4f7f-9953-db63cd55ff3b","Type":"ContainerStarted","Data":"d9c03c178751ee411012a555795b1abc389964d582e6e065fd36c65874990f92"} Feb 17 17:46:23 crc kubenswrapper[4892]: I0217 17:46:23.094413 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43bf66f0-1781-4f7f-9953-db63cd55ff3b","Type":"ContainerStarted","Data":"c7111de8a6262c98aff8ba630fad308e863cf1062a58dae1eb4b695e9e9ced34"} Feb 17 17:46:23 crc kubenswrapper[4892]: I0217 17:46:23.111781 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=2.111752397 podStartE2EDuration="2.111752397s" podCreationTimestamp="2026-02-17 17:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:46:23.109228429 +0000 UTC m=+154.484631694" watchObservedRunningTime="2026-02-17 17:46:23.111752397 +0000 UTC m=+154.487155662" Feb 17 17:46:24 crc kubenswrapper[4892]: I0217 17:46:24.100195 4892 generic.go:334] "Generic (PLEG): container finished" podID="43bf66f0-1781-4f7f-9953-db63cd55ff3b" containerID="d9c03c178751ee411012a555795b1abc389964d582e6e065fd36c65874990f92" exitCode=0 Feb 17 17:46:24 crc kubenswrapper[4892]: I0217 17:46:24.100235 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43bf66f0-1781-4f7f-9953-db63cd55ff3b","Type":"ContainerDied","Data":"d9c03c178751ee411012a555795b1abc389964d582e6e065fd36c65874990f92"} Feb 17 17:46:24 crc kubenswrapper[4892]: I0217 17:46:24.125440 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cv2jf" Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.480028 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.555346 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kube-api-access\") pod \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.559965 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kubelet-dir\") pod \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\" (UID: \"43bf66f0-1781-4f7f-9953-db63cd55ff3b\") " Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.560402 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "43bf66f0-1781-4f7f-9953-db63cd55ff3b" (UID: "43bf66f0-1781-4f7f-9953-db63cd55ff3b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.566996 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "43bf66f0-1781-4f7f-9953-db63cd55ff3b" (UID: "43bf66f0-1781-4f7f-9953-db63cd55ff3b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.663442 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:25 crc kubenswrapper[4892]: I0217 17:46:25.663510 4892 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43bf66f0-1781-4f7f-9953-db63cd55ff3b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:46:26 crc kubenswrapper[4892]: I0217 17:46:26.115063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43bf66f0-1781-4f7f-9953-db63cd55ff3b","Type":"ContainerDied","Data":"c7111de8a6262c98aff8ba630fad308e863cf1062a58dae1eb4b695e9e9ced34"} Feb 17 17:46:26 crc kubenswrapper[4892]: I0217 17:46:26.115104 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7111de8a6262c98aff8ba630fad308e863cf1062a58dae1eb4b695e9e9ced34" Feb 17 17:46:26 crc kubenswrapper[4892]: I0217 17:46:26.115155 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 17:46:29 crc kubenswrapper[4892]: I0217 17:46:29.351688 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:46:30 crc kubenswrapper[4892]: I0217 17:46:30.823527 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-4q9d4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 17 17:46:30 crc kubenswrapper[4892]: I0217 17:46:30.823799 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4q9d4" podUID="843d01a5-8de5-4628-99d0-2ac552e9abf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 17 17:46:30 crc kubenswrapper[4892]: I0217 17:46:30.823550 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-4q9d4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 17 17:46:30 crc kubenswrapper[4892]: I0217 17:46:30.823877 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4q9d4" podUID="843d01a5-8de5-4628-99d0-2ac552e9abf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 17 17:46:30 crc kubenswrapper[4892]: I0217 17:46:30.939637 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:30 crc kubenswrapper[4892]: I0217 17:46:30.943122 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:46:34 crc kubenswrapper[4892]: I0217 17:46:34.611342 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:34 crc kubenswrapper[4892]: I0217 17:46:34.622706 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9290105c-74a4-487a-879f-3f79186b3b01-metrics-certs\") pod \"network-metrics-daemon-2q4n6\" (UID: \"9290105c-74a4-487a-879f-3f79186b3b01\") " pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:34 crc kubenswrapper[4892]: I0217 17:46:34.878944 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q4n6" Feb 17 17:46:37 crc kubenswrapper[4892]: I0217 17:46:37.367241 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-st4m9"] Feb 17 17:46:37 crc kubenswrapper[4892]: I0217 17:46:37.367776 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" containerID="cri-o://6581e50a4ad1f934152494bb5fbed617a4b12fd7ac9a226abae451e0cc4f0a7b" gracePeriod=30 Feb 17 17:46:37 crc kubenswrapper[4892]: I0217 17:46:37.427468 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:46:37 crc kubenswrapper[4892]: I0217 
17:46:37.427551 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:46:37 crc kubenswrapper[4892]: I0217 17:46:37.431292 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9"] Feb 17 17:46:37 crc kubenswrapper[4892]: I0217 17:46:37.431596 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" containerID="cri-o://81b2d892fc75ff371816c3a647f8429d6358f47b2843f1d99914b835402fa303" gracePeriod=30 Feb 17 17:46:38 crc kubenswrapper[4892]: I0217 17:46:38.262273 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" Feb 17 17:46:39 crc kubenswrapper[4892]: I0217 17:46:39.206742 4892 generic.go:334] "Generic (PLEG): container finished" podID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerID="6581e50a4ad1f934152494bb5fbed617a4b12fd7ac9a226abae451e0cc4f0a7b" exitCode=0 Feb 17 17:46:39 crc kubenswrapper[4892]: I0217 17:46:39.206789 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" event={"ID":"183505a9-eeb4-4bf8-94ae-2c593e78b926","Type":"ContainerDied","Data":"6581e50a4ad1f934152494bb5fbed617a4b12fd7ac9a226abae451e0cc4f0a7b"} Feb 17 17:46:40 crc kubenswrapper[4892]: I0217 17:46:40.813257 4892 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-st4m9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 17 17:46:40 crc kubenswrapper[4892]: I0217 17:46:40.813335 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 17 17:46:40 crc kubenswrapper[4892]: I0217 17:46:40.838046 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4q9d4" Feb 17 17:46:40 crc kubenswrapper[4892]: I0217 17:46:40.945442 4892 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qm6w9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 17 17:46:40 crc kubenswrapper[4892]: I0217 17:46:40.945497 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 17 17:46:46 crc kubenswrapper[4892]: E0217 17:46:46.949091 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 17:46:46 crc kubenswrapper[4892]: E0217 17:46:46.949646 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8tfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dzbvh_openshift-marketplace(cda40965-c69d-469d-beb4-91582508ad77): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:46 crc kubenswrapper[4892]: E0217 17:46:46.951058 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dzbvh" podUID="cda40965-c69d-469d-beb4-91582508ad77" Feb 17 17:46:47 crc kubenswrapper[4892]: I0217 17:46:47.259644 4892 generic.go:334] "Generic (PLEG): container finished" podID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerID="81b2d892fc75ff371816c3a647f8429d6358f47b2843f1d99914b835402fa303" exitCode=0 Feb 17 17:46:47 crc kubenswrapper[4892]: I0217 17:46:47.259656 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" event={"ID":"4baae3f3-ceff-445d-9cf0-82284224bff7","Type":"ContainerDied","Data":"81b2d892fc75ff371816c3a647f8429d6358f47b2843f1d99914b835402fa303"} Feb 17 17:46:47 crc kubenswrapper[4892]: E0217 17:46:47.528525 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 17:46:47 crc kubenswrapper[4892]: E0217 17:46:47.528672 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k8vd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vh5f8_openshift-marketplace(cbb9e745-9259-437f-a600-80153e687c65): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:47 crc kubenswrapper[4892]: E0217 17:46:47.530417 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vh5f8" podUID="cbb9e745-9259-437f-a600-80153e687c65" Feb 17 17:46:49 crc 
kubenswrapper[4892]: E0217 17:46:49.161149 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vh5f8" podUID="cbb9e745-9259-437f-a600-80153e687c65" Feb 17 17:46:49 crc kubenswrapper[4892]: E0217 17:46:49.161280 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dzbvh" podUID="cda40965-c69d-469d-beb4-91582508ad77" Feb 17 17:46:50 crc kubenswrapper[4892]: E0217 17:46:50.490478 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 17:46:50 crc kubenswrapper[4892]: E0217 17:46:50.490678 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fbkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2dtr9_openshift-marketplace(9c4922f3-1e12-469d-9afc-c2c52238e551): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:50 crc kubenswrapper[4892]: E0217 17:46:50.492133 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2dtr9" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" Feb 17 17:46:51 crc 
kubenswrapper[4892]: I0217 17:46:51.705935 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ps5cl" Feb 17 17:46:51 crc kubenswrapper[4892]: I0217 17:46:51.813134 4892 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-st4m9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 17:46:51 crc kubenswrapper[4892]: I0217 17:46:51.813207 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 17:46:51 crc kubenswrapper[4892]: I0217 17:46:51.946092 4892 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qm6w9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 17:46:51 crc kubenswrapper[4892]: I0217 17:46:51.946249 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 17:46:54 crc kubenswrapper[4892]: 
E0217 17:46:54.587884 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2dtr9" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" Feb 17 17:46:56 crc kubenswrapper[4892]: E0217 17:46:56.636236 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 17:46:56 crc kubenswrapper[4892]: E0217 17:46:56.636426 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z9j7z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-q2t7d_openshift-marketplace(0a738868-4aa5-4aa8-8058-863b3dfb3acb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:56 crc kubenswrapper[4892]: E0217 17:46:56.638951 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-q2t7d" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" Feb 17 17:46:56 crc 
kubenswrapper[4892]: E0217 17:46:56.688320 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 17:46:56 crc kubenswrapper[4892]: E0217 17:46:56.688588 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bppw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-9n77k_openshift-marketplace(0c8cf662-ff88-481f-ae9d-de0fe49af1f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:56 crc kubenswrapper[4892]: E0217 17:46:56.690199 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9n77k" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" Feb 17 17:46:57 crc kubenswrapper[4892]: I0217 17:46:57.402214 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.566036 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 17:46:58 crc kubenswrapper[4892]: E0217 17:46:58.566317 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bf66f0-1781-4f7f-9953-db63cd55ff3b" containerName="pruner" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.566330 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bf66f0-1781-4f7f-9953-db63cd55ff3b" containerName="pruner" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.566464 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bf66f0-1781-4f7f-9953-db63cd55ff3b" containerName="pruner" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.567386 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.569254 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.571425 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.577884 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.670073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.671486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.772718 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.772870 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.772961 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.797463 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:58 crc kubenswrapper[4892]: I0217 17:46:58.895592 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.719087 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-q2t7d" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.719170 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9n77k" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.754342 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.754763 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jj6nr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-f7rxl_openshift-marketplace(6a4ab50e-1f68-4755-a980-157e38a83436): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.756017 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-f7rxl" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" Feb 17 17:46:59 crc 
kubenswrapper[4892]: I0217 17:46:59.793376 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.795862 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.796072 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqqtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSour
ce{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rjqrw_openshift-marketplace(8498dfc3-1aa0-4059-abad-cab139ba83ec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.798037 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rjqrw" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.799133 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.817570 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bf4b5c686-mb98b"] Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.817855 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.817870 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.817892 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.817900 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.817990 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" containerName="controller-manager" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.818004 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" containerName="route-controller-manager" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.818355 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.837518 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bf4b5c686-mb98b"] Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.847927 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.848068 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2t9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-q424n_openshift-marketplace(96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:46:59 crc kubenswrapper[4892]: E0217 17:46:59.849483 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q424n" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" Feb 17 17:46:59 crc 
kubenswrapper[4892]: I0217 17:46:59.947150 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2q4n6"] Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991235 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5qpz\" (UniqueName: \"kubernetes.io/projected/183505a9-eeb4-4bf8-94ae-2c593e78b926-kube-api-access-j5qpz\") pod \"183505a9-eeb4-4bf8-94ae-2c593e78b926\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991301 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-client-ca\") pod \"4baae3f3-ceff-445d-9cf0-82284224bff7\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991329 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-config\") pod \"4baae3f3-ceff-445d-9cf0-82284224bff7\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991361 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baae3f3-ceff-445d-9cf0-82284224bff7-serving-cert\") pod \"4baae3f3-ceff-445d-9cf0-82284224bff7\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991406 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183505a9-eeb4-4bf8-94ae-2c593e78b926-serving-cert\") pod \"183505a9-eeb4-4bf8-94ae-2c593e78b926\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991449 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-config\") pod \"183505a9-eeb4-4bf8-94ae-2c593e78b926\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991515 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-proxy-ca-bundles\") pod \"183505a9-eeb4-4bf8-94ae-2c593e78b926\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991540 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-client-ca\") pod \"183505a9-eeb4-4bf8-94ae-2c593e78b926\" (UID: \"183505a9-eeb4-4bf8-94ae-2c593e78b926\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991560 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5plsk\" (UniqueName: \"kubernetes.io/projected/4baae3f3-ceff-445d-9cf0-82284224bff7-kube-api-access-5plsk\") pod \"4baae3f3-ceff-445d-9cf0-82284224bff7\" (UID: \"4baae3f3-ceff-445d-9cf0-82284224bff7\") " Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-proxy-ca-bundles\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991841 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-config\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991872 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqjhd\" (UniqueName: \"kubernetes.io/projected/47bd4c66-226c-455a-8442-947e453e4f8e-kube-api-access-pqjhd\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991899 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-client-ca\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.992666 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-client-ca" (OuterVolumeSpecName: "client-ca") pod "183505a9-eeb4-4bf8-94ae-2c593e78b926" (UID: "183505a9-eeb4-4bf8-94ae-2c593e78b926"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.992698 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-client-ca" (OuterVolumeSpecName: "client-ca") pod "4baae3f3-ceff-445d-9cf0-82284224bff7" (UID: "4baae3f3-ceff-445d-9cf0-82284224bff7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.992792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-config" (OuterVolumeSpecName: "config") pod "4baae3f3-ceff-445d-9cf0-82284224bff7" (UID: "4baae3f3-ceff-445d-9cf0-82284224bff7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.993252 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "183505a9-eeb4-4bf8-94ae-2c593e78b926" (UID: "183505a9-eeb4-4bf8-94ae-2c593e78b926"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.993277 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-config" (OuterVolumeSpecName: "config") pod "183505a9-eeb4-4bf8-94ae-2c593e78b926" (UID: "183505a9-eeb4-4bf8-94ae-2c593e78b926"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.991919 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47bd4c66-226c-455a-8442-947e453e4f8e-serving-cert\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.997709 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183505a9-eeb4-4bf8-94ae-2c593e78b926-kube-api-access-j5qpz" (OuterVolumeSpecName: "kube-api-access-j5qpz") pod "183505a9-eeb4-4bf8-94ae-2c593e78b926" (UID: "183505a9-eeb4-4bf8-94ae-2c593e78b926"). InnerVolumeSpecName "kube-api-access-j5qpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:46:59 crc kubenswrapper[4892]: I0217 17:46:59.998832 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baae3f3-ceff-445d-9cf0-82284224bff7-kube-api-access-5plsk" (OuterVolumeSpecName: "kube-api-access-5plsk") pod "4baae3f3-ceff-445d-9cf0-82284224bff7" (UID: "4baae3f3-ceff-445d-9cf0-82284224bff7"). InnerVolumeSpecName "kube-api-access-5plsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001134 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001172 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001186 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/183505a9-eeb4-4bf8-94ae-2c593e78b926-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001197 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5plsk\" (UniqueName: \"kubernetes.io/projected/4baae3f3-ceff-445d-9cf0-82284224bff7-kube-api-access-5plsk\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001208 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5qpz\" (UniqueName: \"kubernetes.io/projected/183505a9-eeb4-4bf8-94ae-2c593e78b926-kube-api-access-j5qpz\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001218 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.001227 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baae3f3-ceff-445d-9cf0-82284224bff7-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.004367 4892 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183505a9-eeb4-4bf8-94ae-2c593e78b926-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "183505a9-eeb4-4bf8-94ae-2c593e78b926" (UID: "183505a9-eeb4-4bf8-94ae-2c593e78b926"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.005564 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4baae3f3-ceff-445d-9cf0-82284224bff7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4baae3f3-ceff-445d-9cf0-82284224bff7" (UID: "4baae3f3-ceff-445d-9cf0-82284224bff7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-proxy-ca-bundles\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102802 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-config\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102851 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqjhd\" (UniqueName: \"kubernetes.io/projected/47bd4c66-226c-455a-8442-947e453e4f8e-kube-api-access-pqjhd\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " 
pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102878 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-client-ca\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102898 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47bd4c66-226c-455a-8442-947e453e4f8e-serving-cert\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102978 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baae3f3-ceff-445d-9cf0-82284224bff7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.102993 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183505a9-eeb4-4bf8-94ae-2c593e78b926-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.104078 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-proxy-ca-bundles\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.106131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-config\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.106156 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47bd4c66-226c-455a-8442-947e453e4f8e-serving-cert\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.113675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-client-ca\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.122341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqjhd\" (UniqueName: \"kubernetes.io/projected/47bd4c66-226c-455a-8442-947e453e4f8e-kube-api-access-pqjhd\") pod \"controller-manager-bf4b5c686-mb98b\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.141312 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.208142 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.349304 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bf4b5c686-mb98b"] Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.351181 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" event={"ID":"4baae3f3-ceff-445d-9cf0-82284224bff7","Type":"ContainerDied","Data":"1f20e94cde5065059433c85feeb14e30293940ceb285307fd606d09b79f93de8"} Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.351235 4892 scope.go:117] "RemoveContainer" containerID="81b2d892fc75ff371816c3a647f8429d6358f47b2843f1d99914b835402fa303" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.351357 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.356734 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf","Type":"ContainerStarted","Data":"11283016c3f53377d813e15072af8f183feb949fffe754d058e8c774c7b70fb4"} Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.358914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" event={"ID":"9290105c-74a4-487a-879f-3f79186b3b01","Type":"ContainerStarted","Data":"a8ce7d0462c0554f234372de5c065de5ab26185ad2d54584b397257c214069b8"} Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.358950 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" event={"ID":"9290105c-74a4-487a-879f-3f79186b3b01","Type":"ContainerStarted","Data":"cfb67e2351e9374abc85902c6654d3cf3578bfaf35ed60825622a4c19d204170"} Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.373793 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.374082 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-st4m9" event={"ID":"183505a9-eeb4-4bf8-94ae-2c593e78b926","Type":"ContainerDied","Data":"944a81ae14b6bd46b5d8e3a477b6440d90f4a94880c87a970f73185ea8da5b09"} Feb 17 17:47:00 crc kubenswrapper[4892]: E0217 17:47:00.381083 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q424n" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" Feb 17 17:47:00 crc kubenswrapper[4892]: E0217 17:47:00.381267 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-f7rxl" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" Feb 17 17:47:00 crc kubenswrapper[4892]: E0217 17:47:00.381336 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rjqrw" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.411045 4892 scope.go:117] "RemoveContainer" containerID="6581e50a4ad1f934152494bb5fbed617a4b12fd7ac9a226abae451e0cc4f0a7b" Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.444194 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9"] Feb 17 17:47:00 crc 
kubenswrapper[4892]: I0217 17:47:00.461737 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qm6w9"] Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.478472 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-st4m9"] Feb 17 17:47:00 crc kubenswrapper[4892]: I0217 17:47:00.479307 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-st4m9"] Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.379954 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183505a9-eeb4-4bf8-94ae-2c593e78b926" path="/var/lib/kubelet/pods/183505a9-eeb4-4bf8-94ae-2c593e78b926/volumes" Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.383888 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baae3f3-ceff-445d-9cf0-82284224bff7" path="/var/lib/kubelet/pods/4baae3f3-ceff-445d-9cf0-82284224bff7/volumes" Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.384861 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2q4n6" event={"ID":"9290105c-74a4-487a-879f-3f79186b3b01","Type":"ContainerStarted","Data":"479dbb0ac3018f5beaefab53fb6869c7193e8fc9268883337b9e45dd7d1f95e8"} Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.391458 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" event={"ID":"47bd4c66-226c-455a-8442-947e453e4f8e","Type":"ContainerStarted","Data":"3c42f71d5767472611dedc02167e92b60eeed5d548841dab0369cfadd400498f"} Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.391512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" 
event={"ID":"47bd4c66-226c-455a-8442-947e453e4f8e","Type":"ContainerStarted","Data":"da098827ccdf081e1752c50fdbbfc6acbad59776f937301ed1c5bd8678b0841b"}
Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.391778 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b"
Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.406220 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2q4n6" podStartSLOduration=169.406191243 podStartE2EDuration="2m49.406191243s" podCreationTimestamp="2026-02-17 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:01.405388652 +0000 UTC m=+192.780791957" watchObservedRunningTime="2026-02-17 17:47:01.406191243 +0000 UTC m=+192.781594548"
Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.407477 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b"
Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.411496 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf","Type":"ContainerStarted","Data":"6728f4d582137d12544c8d0cc69b913dc214b4741545c4deda7e7aa97e585930"}
Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.436270 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" podStartSLOduration=4.4362213839999995 podStartE2EDuration="4.436221384s" podCreationTimestamp="2026-02-17 17:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:01.427785549 +0000 UTC m=+192.803188834" watchObservedRunningTime="2026-02-17 17:47:01.436221384 +0000 UTC m=+192.811624669"
Feb 17 17:47:01 crc kubenswrapper[4892]: I0217 17:47:01.499253 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.499225914 podStartE2EDuration="3.499225914s" podCreationTimestamp="2026-02-17 17:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:01.491187859 +0000 UTC m=+192.866591154" watchObservedRunningTime="2026-02-17 17:47:01.499225914 +0000 UTC m=+192.874629189"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.418978 4892 generic.go:334] "Generic (PLEG): container finished" podID="cbb9e745-9259-437f-a600-80153e687c65" containerID="738eaefa7812c1c2f8b363b20ed4c5492c125b041a57e242e9f2ed854604c068" exitCode=0
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.419096 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh5f8" event={"ID":"cbb9e745-9259-437f-a600-80153e687c65","Type":"ContainerDied","Data":"738eaefa7812c1c2f8b363b20ed4c5492c125b041a57e242e9f2ed854604c068"}
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.421261 4892 generic.go:334] "Generic (PLEG): container finished" podID="8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf" containerID="6728f4d582137d12544c8d0cc69b913dc214b4741545c4deda7e7aa97e585930" exitCode=0
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.421357 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf","Type":"ContainerDied","Data":"6728f4d582137d12544c8d0cc69b913dc214b4741545c4deda7e7aa97e585930"}
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.670057 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"]
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.673428 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"]
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.673548 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.677687 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.679072 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.679449 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.679736 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.679994 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.680080 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.747772 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-config\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.747937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzbl\" (UniqueName: \"kubernetes.io/projected/a6708749-50f6-4384-89eb-1d0d3165add7-kube-api-access-wlzbl\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.748031 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6708749-50f6-4384-89eb-1d0d3165add7-serving-cert\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.748063 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-client-ca\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.849274 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6708749-50f6-4384-89eb-1d0d3165add7-serving-cert\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.849320 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-client-ca\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.849456 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-config\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.849506 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzbl\" (UniqueName: \"kubernetes.io/projected/a6708749-50f6-4384-89eb-1d0d3165add7-kube-api-access-wlzbl\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.850633 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-client-ca\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.851318 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-config\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.858420 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6708749-50f6-4384-89eb-1d0d3165add7-serving-cert\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:02 crc kubenswrapper[4892]: I0217 17:47:02.874017 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzbl\" (UniqueName: \"kubernetes.io/projected/a6708749-50f6-4384-89eb-1d0d3165add7-kube-api-access-wlzbl\") pod \"route-controller-manager-f4b8cf5d-nlk6v\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") " pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.004621 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.441972 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh5f8" event={"ID":"cbb9e745-9259-437f-a600-80153e687c65","Type":"ContainerStarted","Data":"942cc645d18d4a896708d9d97cdfe4687015e7e43dea707b1f4f3fa654b94f5b"}
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.446033 4892 generic.go:334] "Generic (PLEG): container finished" podID="cda40965-c69d-469d-beb4-91582508ad77" containerID="ed58704cf7bab7d976c821b136d4779f41853fa8d123b8a98a29c62ed2054f9d" exitCode=0
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.446141 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzbvh" event={"ID":"cda40965-c69d-469d-beb4-91582508ad77","Type":"ContainerDied","Data":"ed58704cf7bab7d976c821b136d4779f41853fa8d123b8a98a29c62ed2054f9d"}
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.473469 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vh5f8" podStartSLOduration=2.472938789 podStartE2EDuration="46.473450247s" podCreationTimestamp="2026-02-17 17:46:17 +0000 UTC" firstStartedPulling="2026-02-17 17:46:18.917082834 +0000 UTC m=+150.292486099" lastFinishedPulling="2026-02-17 17:47:02.917594292 +0000 UTC m=+194.292997557" observedRunningTime="2026-02-17 17:47:03.467611152 +0000 UTC m=+194.843014417" watchObservedRunningTime="2026-02-17 17:47:03.473450247 +0000 UTC m=+194.848853512"
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.477664 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"]
Feb 17 17:47:03 crc kubenswrapper[4892]: W0217 17:47:03.488946 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6708749_50f6_4384_89eb_1d0d3165add7.slice/crio-566d834b9b77f25b0b460207a0413ae7898acf36b2a9fc3777e6ce4aa10eecb5 WatchSource:0}: Error finding container 566d834b9b77f25b0b460207a0413ae7898acf36b2a9fc3777e6ce4aa10eecb5: Status 404 returned error can't find the container with id 566d834b9b77f25b0b460207a0413ae7898acf36b2a9fc3777e6ce4aa10eecb5
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.690229 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.764703 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kube-api-access\") pod \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") "
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.765129 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kubelet-dir\") pod \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\" (UID: \"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf\") "
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.765248 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf" (UID: "8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.765661 4892 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.769721 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf" (UID: "8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:47:03 crc kubenswrapper[4892]: I0217 17:47:03.867006 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.453681 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.453700 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf","Type":"ContainerDied","Data":"11283016c3f53377d813e15072af8f183feb949fffe754d058e8c774c7b70fb4"}
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.454071 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11283016c3f53377d813e15072af8f183feb949fffe754d058e8c774c7b70fb4"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.455347 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" event={"ID":"a6708749-50f6-4384-89eb-1d0d3165add7","Type":"ContainerStarted","Data":"1d66c54aba10ed5a24c969778b336f4f63acefd010c36094f3d06145ac9da29d"}
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.455376 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" event={"ID":"a6708749-50f6-4384-89eb-1d0d3165add7","Type":"ContainerStarted","Data":"566d834b9b77f25b0b460207a0413ae7898acf36b2a9fc3777e6ce4aa10eecb5"}
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.456335 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.458002 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzbvh" event={"ID":"cda40965-c69d-469d-beb4-91582508ad77","Type":"ContainerStarted","Data":"61ac8f14b49f3fd6b98272becd05c65f3abb07b77446f6e3664418cebc690262"}
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.465233 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.471866 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" podStartSLOduration=7.471848315 podStartE2EDuration="7.471848315s" podCreationTimestamp="2026-02-17 17:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:04.468782133 +0000 UTC m=+195.844185398" watchObservedRunningTime="2026-02-17 17:47:04.471848315 +0000 UTC m=+195.847251590"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.504335 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzbvh" podStartSLOduration=3.564592223 podStartE2EDuration="47.504316621s" podCreationTimestamp="2026-02-17 17:46:17 +0000 UTC" firstStartedPulling="2026-02-17 17:46:19.987304067 +0000 UTC m=+151.362707332" lastFinishedPulling="2026-02-17 17:47:03.927028465 +0000 UTC m=+195.302431730" observedRunningTime="2026-02-17 17:47:04.501966948 +0000 UTC m=+195.877370213" watchObservedRunningTime="2026-02-17 17:47:04.504316621 +0000 UTC m=+195.879719886"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.764854 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 17:47:04 crc kubenswrapper[4892]: E0217 17:47:04.765320 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf" containerName="pruner"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.765416 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf" containerName="pruner"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.765593 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8873a6a5-2fc4-4f3e-9f43-2f73368b3eaf" containerName="pruner"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.766596 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.771371 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.771929 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.777187 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.883259 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-var-lock\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.883316 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.883402 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab22fa14-083f-49e3-bfac-a797f4393a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.984885 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-var-lock\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.984934 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.984985 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab22fa14-083f-49e3-bfac-a797f4393a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.985047 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-var-lock\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:04 crc kubenswrapper[4892]: I0217 17:47:04.985093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:05 crc kubenswrapper[4892]: I0217 17:47:05.004699 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab22fa14-083f-49e3-bfac-a797f4393a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:05 crc kubenswrapper[4892]: I0217 17:47:05.082417 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 17:47:05 crc kubenswrapper[4892]: I0217 17:47:05.504748 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 17:47:05 crc kubenswrapper[4892]: W0217 17:47:05.511085 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab22fa14_083f_49e3_bfac_a797f4393a1c.slice/crio-517c82aea51facb30934905fe55335563cea82a61608b2c32f47896213638bcd WatchSource:0}: Error finding container 517c82aea51facb30934905fe55335563cea82a61608b2c32f47896213638bcd: Status 404 returned error can't find the container with id 517c82aea51facb30934905fe55335563cea82a61608b2c32f47896213638bcd
Feb 17 17:47:06 crc kubenswrapper[4892]: I0217 17:47:06.470469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab22fa14-083f-49e3-bfac-a797f4393a1c","Type":"ContainerStarted","Data":"e42fd288b10656213d300284a6312c12c0d7d24d9471f1667b535a75f57a5ca6"}
Feb 17 17:47:06 crc kubenswrapper[4892]: I0217 17:47:06.470965 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab22fa14-083f-49e3-bfac-a797f4393a1c","Type":"ContainerStarted","Data":"517c82aea51facb30934905fe55335563cea82a61608b2c32f47896213638bcd"}
Feb 17 17:47:06 crc kubenswrapper[4892]: I0217 17:47:06.491619 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.491591192 podStartE2EDuration="2.491591192s" podCreationTimestamp="2026-02-17 17:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:06.484709099 +0000 UTC m=+197.860112374" watchObservedRunningTime="2026-02-17 17:47:06.491591192 +0000 UTC m=+197.866994507"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.425290 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.425376 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.479621 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vh5f8"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.479996 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vh5f8"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.643120 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vh5f8"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.919102 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzbvh"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.919163 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzbvh"
Feb 17 17:47:07 crc kubenswrapper[4892]: I0217 17:47:07.974151 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzbvh"
Feb 17 17:47:08 crc kubenswrapper[4892]: I0217 17:47:08.519094 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vh5f8"
Feb 17 17:47:10 crc kubenswrapper[4892]: I0217 17:47:10.491576 4892 generic.go:334] "Generic (PLEG): container finished" podID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerID="43b933a294321da90e7290922622d50a53a6ade207ac81e142604060032d6811" exitCode=0
Feb 17 17:47:10 crc kubenswrapper[4892]: I0217 17:47:10.492771 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dtr9" event={"ID":"9c4922f3-1e12-469d-9afc-c2c52238e551","Type":"ContainerDied","Data":"43b933a294321da90e7290922622d50a53a6ade207ac81e142604060032d6811"}
Feb 17 17:47:11 crc kubenswrapper[4892]: I0217 17:47:11.500622 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dtr9" event={"ID":"9c4922f3-1e12-469d-9afc-c2c52238e551","Type":"ContainerStarted","Data":"eb82e87c02e4da31e4beea83f2bb51bb4112d961eedd71995389ff5801902e41"}
Feb 17 17:47:11 crc kubenswrapper[4892]: I0217 17:47:11.518244 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2dtr9" podStartSLOduration=3.391658086 podStartE2EDuration="54.518229511s" podCreationTimestamp="2026-02-17 17:46:17 +0000 UTC" firstStartedPulling="2026-02-17 17:46:19.981263195 +0000 UTC m=+151.356666460" lastFinishedPulling="2026-02-17 17:47:11.10783462 +0000 UTC m=+202.483237885" observedRunningTime="2026-02-17 17:47:11.516619721 +0000 UTC m=+202.892023006" watchObservedRunningTime="2026-02-17 17:47:11.518229511 +0000 UTC m=+202.893632776"
Feb 17 17:47:12 crc kubenswrapper[4892]: I0217 17:47:12.508135 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerStarted","Data":"7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831"}
Feb 17 17:47:13 crc kubenswrapper[4892]: I0217 17:47:13.513310 4892 generic.go:334] "Generic (PLEG): container finished" podID="6a4ab50e-1f68-4755-a980-157e38a83436" containerID="7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831" exitCode=0
Feb 17 17:47:13 crc kubenswrapper[4892]: I0217 17:47:13.513356 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerDied","Data":"7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831"}
Feb 17 17:47:14 crc kubenswrapper[4892]: I0217 17:47:14.520435 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerID="149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3" exitCode=0
Feb 17 17:47:14 crc kubenswrapper[4892]: I0217 17:47:14.520529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2t7d" event={"ID":"0a738868-4aa5-4aa8-8058-863b3dfb3acb","Type":"ContainerDied","Data":"149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3"}
Feb 17 17:47:14 crc kubenswrapper[4892]: I0217 17:47:14.524538 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerStarted","Data":"d4e5aa159dbea66b9719c1a7698c99b8cbad3d95e0b2fb1f0af96a0053f96143"}
Feb 17 17:47:15 crc kubenswrapper[4892]: I0217 17:47:15.530514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerStarted","Data":"d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33"}
Feb 17 17:47:15 crc kubenswrapper[4892]: I0217 17:47:15.531888 4892 generic.go:334] "Generic (PLEG): container finished" podID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerID="d4e5aa159dbea66b9719c1a7698c99b8cbad3d95e0b2fb1f0af96a0053f96143" exitCode=0
Feb 17 17:47:15 crc kubenswrapper[4892]: I0217 17:47:15.531943 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerDied","Data":"d4e5aa159dbea66b9719c1a7698c99b8cbad3d95e0b2fb1f0af96a0053f96143"}
Feb 17 17:47:15 crc kubenswrapper[4892]: I0217 17:47:15.546761 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7rxl" podStartSLOduration=3.61156197 podStartE2EDuration="55.546745451s" podCreationTimestamp="2026-02-17 17:46:20 +0000 UTC" firstStartedPulling="2026-02-17 17:46:22.08227989 +0000 UTC m=+153.457683155" lastFinishedPulling="2026-02-17 17:47:14.017463381 +0000 UTC m=+205.392866636" observedRunningTime="2026-02-17 17:47:15.543224981 +0000 UTC m=+206.918628246" watchObservedRunningTime="2026-02-17 17:47:15.546745451 +0000 UTC m=+206.922148716"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.334088 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bf4b5c686-mb98b"]
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.334528 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" podUID="47bd4c66-226c-455a-8442-947e453e4f8e" containerName="controller-manager" containerID="cri-o://3c42f71d5767472611dedc02167e92b60eeed5d548841dab0369cfadd400498f" gracePeriod=30
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.370248 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"]
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.370461 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" podUID="a6708749-50f6-4384-89eb-1d0d3165add7" containerName="route-controller-manager" containerID="cri-o://1d66c54aba10ed5a24c969778b336f4f63acefd010c36094f3d06145ac9da29d" gracePeriod=30
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.548891 4892 generic.go:334] "Generic (PLEG): container finished" podID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerID="b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b" exitCode=0
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.548951 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n77k" event={"ID":"0c8cf662-ff88-481f-ae9d-de0fe49af1f4","Type":"ContainerDied","Data":"b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b"}
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.564913 4892 generic.go:334] "Generic (PLEG): container finished" podID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerID="16c7ce864e5766da1e505fa587594e5df9b68fd94e3b9755e98c650cd170ef65" exitCode=0
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.565534 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjqrw" event={"ID":"8498dfc3-1aa0-4059-abad-cab139ba83ec","Type":"ContainerDied","Data":"16c7ce864e5766da1e505fa587594e5df9b68fd94e3b9755e98c650cd170ef65"}
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.572387 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerStarted","Data":"72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18"}
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.578291 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2t7d" event={"ID":"0a738868-4aa5-4aa8-8058-863b3dfb3acb","Type":"ContainerStarted","Data":"5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5"}
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.580285 4892 generic.go:334] "Generic (PLEG): container finished" podID="a6708749-50f6-4384-89eb-1d0d3165add7" containerID="1d66c54aba10ed5a24c969778b336f4f63acefd010c36094f3d06145ac9da29d" exitCode=0
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.580337 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" event={"ID":"a6708749-50f6-4384-89eb-1d0d3165add7","Type":"ContainerDied","Data":"1d66c54aba10ed5a24c969778b336f4f63acefd010c36094f3d06145ac9da29d"}
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.581253 4892 generic.go:334] "Generic (PLEG): container finished" podID="47bd4c66-226c-455a-8442-947e453e4f8e" containerID="3c42f71d5767472611dedc02167e92b60eeed5d548841dab0369cfadd400498f" exitCode=0
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.581270 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" event={"ID":"47bd4c66-226c-455a-8442-947e453e4f8e","Type":"ContainerDied","Data":"3c42f71d5767472611dedc02167e92b60eeed5d548841dab0369cfadd400498f"}
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.673499 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2dtr9"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.674213 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2dtr9"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.716019 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q2t7d" podStartSLOduration=3.031361185 podStartE2EDuration="58.716000509s" podCreationTimestamp="2026-02-17 17:46:19 +0000 UTC" firstStartedPulling="2026-02-17 17:46:21.069361956 +0000 UTC m=+152.444765221" lastFinishedPulling="2026-02-17 17:47:16.75400128 +0000 UTC m=+208.129404545" observedRunningTime="2026-02-17 17:47:17.682929394 +0000 UTC m=+209.058332669" watchObservedRunningTime="2026-02-17 17:47:17.716000509 +0000 UTC m=+209.091403774"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.726478 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2dtr9"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.743676 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q424n" podStartSLOduration=2.521179133 podStartE2EDuration="57.743659057s" podCreationTimestamp="2026-02-17 17:46:20 +0000 UTC" firstStartedPulling="2026-02-17 17:46:22.084002726 +0000 UTC m=+153.459405991" lastFinishedPulling="2026-02-17 17:47:17.30648265 +0000 UTC m=+208.681885915" observedRunningTime="2026-02-17 17:47:17.714988113 +0000 UTC m=+209.090391368" watchObservedRunningTime="2026-02-17 17:47:17.743659057 +0000 UTC m=+209.119062322"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.902775 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"
Feb 17 17:47:17 crc kubenswrapper[4892]: I0217 17:47:17.971550 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzbvh"
Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.078092 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-config\") pod \"a6708749-50f6-4384-89eb-1d0d3165add7\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") "
Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.078209 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6708749-50f6-4384-89eb-1d0d3165add7-serving-cert\") pod \"a6708749-50f6-4384-89eb-1d0d3165add7\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") "
Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.078274 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-client-ca\") pod \"a6708749-50f6-4384-89eb-1d0d3165add7\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") "
Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.079222 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlzbl\" (UniqueName: \"kubernetes.io/projected/a6708749-50f6-4384-89eb-1d0d3165add7-kube-api-access-wlzbl\") pod \"a6708749-50f6-4384-89eb-1d0d3165add7\" (UID: \"a6708749-50f6-4384-89eb-1d0d3165add7\") "
Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.079127 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-config" (OuterVolumeSpecName: "config") pod "a6708749-50f6-4384-89eb-1d0d3165add7" (UID: 
"a6708749-50f6-4384-89eb-1d0d3165add7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.079141 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-client-ca" (OuterVolumeSpecName: "client-ca") pod "a6708749-50f6-4384-89eb-1d0d3165add7" (UID: "a6708749-50f6-4384-89eb-1d0d3165add7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.079947 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.079967 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6708749-50f6-4384-89eb-1d0d3165add7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.082951 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6708749-50f6-4384-89eb-1d0d3165add7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a6708749-50f6-4384-89eb-1d0d3165add7" (UID: "a6708749-50f6-4384-89eb-1d0d3165add7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.082985 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6708749-50f6-4384-89eb-1d0d3165add7-kube-api-access-wlzbl" (OuterVolumeSpecName: "kube-api-access-wlzbl") pod "a6708749-50f6-4384-89eb-1d0d3165add7" (UID: "a6708749-50f6-4384-89eb-1d0d3165add7"). InnerVolumeSpecName "kube-api-access-wlzbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.180330 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlzbl\" (UniqueName: \"kubernetes.io/projected/a6708749-50f6-4384-89eb-1d0d3165add7-kube-api-access-wlzbl\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.180354 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6708749-50f6-4384-89eb-1d0d3165add7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.588650 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.595377 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v" event={"ID":"a6708749-50f6-4384-89eb-1d0d3165add7","Type":"ContainerDied","Data":"566d834b9b77f25b0b460207a0413ae7898acf36b2a9fc3777e6ce4aa10eecb5"} Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.595459 4892 scope.go:117] "RemoveContainer" containerID="1d66c54aba10ed5a24c969778b336f4f63acefd010c36094f3d06145ac9da29d" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.612008 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"] Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.615618 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4b8cf5d-nlk6v"] Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.649151 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 
17:47:18.687767 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6"] Feb 17 17:47:18 crc kubenswrapper[4892]: E0217 17:47:18.688033 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6708749-50f6-4384-89eb-1d0d3165add7" containerName="route-controller-manager" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.688048 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6708749-50f6-4384-89eb-1d0d3165add7" containerName="route-controller-manager" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.688153 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6708749-50f6-4384-89eb-1d0d3165add7" containerName="route-controller-manager" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.688600 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.692155 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.692377 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.692546 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.692809 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.693003 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:47:18 crc 
kubenswrapper[4892]: I0217 17:47:18.693144 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.704664 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6"] Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.791707 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-config\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.791770 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-client-ca\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.791826 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntm54\" (UniqueName: \"kubernetes.io/projected/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-kube-api-access-ntm54\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.791962 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-serving-cert\") 
pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.893182 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-config\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.893247 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-client-ca\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.893288 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntm54\" (UniqueName: \"kubernetes.io/projected/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-kube-api-access-ntm54\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.893324 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-serving-cert\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.894171 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-client-ca\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.894433 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-config\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.902692 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-serving-cert\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:18 crc kubenswrapper[4892]: I0217 17:47:18.911603 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntm54\" (UniqueName: \"kubernetes.io/projected/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-kube-api-access-ntm54\") pod \"route-controller-manager-5c88bcf5b9-9bqj6\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.003783 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.366782 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6708749-50f6-4384-89eb-1d0d3165add7" path="/var/lib/kubelet/pods/a6708749-50f6-4384-89eb-1d0d3165add7/volumes" Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.394951 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6"] Feb 17 17:47:19 crc kubenswrapper[4892]: W0217 17:47:19.400467 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56da1fc_e0fa_4d86_89d8_b46b1c4f5765.slice/crio-ce8a7b347bc0ba11c3ee56b8460317edb35c5877d25f4f642b85d9c34093327d WatchSource:0}: Error finding container ce8a7b347bc0ba11c3ee56b8460317edb35c5877d25f4f642b85d9c34093327d: Status 404 returned error can't find the container with id ce8a7b347bc0ba11c3ee56b8460317edb35c5877d25f4f642b85d9c34093327d Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.593912 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" event={"ID":"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765","Type":"ContainerStarted","Data":"ce8a7b347bc0ba11c3ee56b8460317edb35c5877d25f4f642b85d9c34093327d"} Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.788652 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzbvh"] Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.789340 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dzbvh" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="registry-server" containerID="cri-o://61ac8f14b49f3fd6b98272becd05c65f3abb07b77446f6e3664418cebc690262" gracePeriod=2 Feb 
17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.869727 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.869774 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:47:19 crc kubenswrapper[4892]: I0217 17:47:19.910376 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.142672 4892 patch_prober.go:28] interesting pod/controller-manager-bf4b5c686-mb98b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.142733 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" podUID="47bd4c66-226c-455a-8442-947e453e4f8e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.547804 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.601103 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.601372 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf4b5c686-mb98b" event={"ID":"47bd4c66-226c-455a-8442-947e453e4f8e","Type":"ContainerDied","Data":"da098827ccdf081e1752c50fdbbfc6acbad59776f937301ed1c5bd8678b0841b"} Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.601406 4892 scope.go:117] "RemoveContainer" containerID="3c42f71d5767472611dedc02167e92b60eeed5d548841dab0369cfadd400498f" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.683398 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6947b757c5-8hwvg"] Feb 17 17:47:20 crc kubenswrapper[4892]: E0217 17:47:20.683635 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bd4c66-226c-455a-8442-947e453e4f8e" containerName="controller-manager" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.683650 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bd4c66-226c-455a-8442-947e453e4f8e" containerName="controller-manager" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.683772 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="47bd4c66-226c-455a-8442-947e453e4f8e" containerName="controller-manager" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.684248 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.694860 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6947b757c5-8hwvg"] Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712139 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqjhd\" (UniqueName: \"kubernetes.io/projected/47bd4c66-226c-455a-8442-947e453e4f8e-kube-api-access-pqjhd\") pod \"47bd4c66-226c-455a-8442-947e453e4f8e\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712540 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-client-ca\") pod \"47bd4c66-226c-455a-8442-947e453e4f8e\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712608 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-proxy-ca-bundles\") pod \"47bd4c66-226c-455a-8442-947e453e4f8e\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712660 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-config\") pod \"47bd4c66-226c-455a-8442-947e453e4f8e\" (UID: \"47bd4c66-226c-455a-8442-947e453e4f8e\") " Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712694 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47bd4c66-226c-455a-8442-947e453e4f8e-serving-cert\") pod \"47bd4c66-226c-455a-8442-947e453e4f8e\" (UID: 
\"47bd4c66-226c-455a-8442-947e453e4f8e\") " Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-proxy-ca-bundles\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.712875 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-client-ca\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713022 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-serving-cert\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713058 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-config\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713169 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-client-ca" 
(OuterVolumeSpecName: "client-ca") pod "47bd4c66-226c-455a-8442-947e453e4f8e" (UID: "47bd4c66-226c-455a-8442-947e453e4f8e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713179 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4xzb\" (UniqueName: \"kubernetes.io/projected/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-kube-api-access-c4xzb\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713577 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713259 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "47bd4c66-226c-455a-8442-947e453e4f8e" (UID: "47bd4c66-226c-455a-8442-947e453e4f8e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.713309 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-config" (OuterVolumeSpecName: "config") pod "47bd4c66-226c-455a-8442-947e453e4f8e" (UID: "47bd4c66-226c-455a-8442-947e453e4f8e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.729399 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47bd4c66-226c-455a-8442-947e453e4f8e-kube-api-access-pqjhd" (OuterVolumeSpecName: "kube-api-access-pqjhd") pod "47bd4c66-226c-455a-8442-947e453e4f8e" (UID: "47bd4c66-226c-455a-8442-947e453e4f8e"). InnerVolumeSpecName "kube-api-access-pqjhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.729923 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47bd4c66-226c-455a-8442-947e453e4f8e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47bd4c66-226c-455a-8442-947e453e4f8e" (UID: "47bd4c66-226c-455a-8442-947e453e4f8e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814374 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-proxy-ca-bundles\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814476 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-client-ca\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-serving-cert\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-config\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814675 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4xzb\" (UniqueName: \"kubernetes.io/projected/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-kube-api-access-c4xzb\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814777 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814807 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47bd4c66-226c-455a-8442-947e453e4f8e-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814868 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47bd4c66-226c-455a-8442-947e453e4f8e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.814891 4892 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pqjhd\" (UniqueName: \"kubernetes.io/projected/47bd4c66-226c-455a-8442-947e453e4f8e-kube-api-access-pqjhd\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.815645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-client-ca\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.815649 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-proxy-ca-bundles\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.816402 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-config\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.820424 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-serving-cert\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.831075 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4xzb\" (UniqueName: 
\"kubernetes.io/projected/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-kube-api-access-c4xzb\") pod \"controller-manager-6947b757c5-8hwvg\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.862710 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.862781 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.940565 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bf4b5c686-mb98b"] Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.945677 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bf4b5c686-mb98b"] Feb 17 17:47:20 crc kubenswrapper[4892]: I0217 17:47:20.996841 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.263721 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6947b757c5-8hwvg"] Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.275200 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.275234 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:47:21 crc kubenswrapper[4892]: W0217 17:47:21.277150 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode20b74ec_bb8b_4674_b0f6_31d8f02ff100.slice/crio-be87bfdce5887b47abcbe6369a7c1ee1d100bec2dc5cf37780579a83a205ef4b WatchSource:0}: Error finding container be87bfdce5887b47abcbe6369a7c1ee1d100bec2dc5cf37780579a83a205ef4b: Status 404 returned error can't find the container with id be87bfdce5887b47abcbe6369a7c1ee1d100bec2dc5cf37780579a83a205ef4b Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.339206 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.369002 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47bd4c66-226c-455a-8442-947e453e4f8e" path="/var/lib/kubelet/pods/47bd4c66-226c-455a-8442-947e453e4f8e/volumes" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.618616 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" event={"ID":"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765","Type":"ContainerStarted","Data":"c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162"} Feb 17 
17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.619253 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.622107 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" event={"ID":"e20b74ec-bb8b-4674-b0f6-31d8f02ff100","Type":"ContainerStarted","Data":"5ccae063352eef56f9ae75430e2bfcf2f1f044a128eb9dee1a25dcc4388a0efc"} Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.622495 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" event={"ID":"e20b74ec-bb8b-4674-b0f6-31d8f02ff100","Type":"ContainerStarted","Data":"be87bfdce5887b47abcbe6369a7c1ee1d100bec2dc5cf37780579a83a205ef4b"} Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.623360 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.632657 4892 patch_prober.go:28] interesting pod/controller-manager-6947b757c5-8hwvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.632948 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" podUID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.634015 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.635681 4892 generic.go:334] "Generic (PLEG): container finished" podID="cda40965-c69d-469d-beb4-91582508ad77" containerID="61ac8f14b49f3fd6b98272becd05c65f3abb07b77446f6e3664418cebc690262" exitCode=0 Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.635883 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzbvh" event={"ID":"cda40965-c69d-469d-beb4-91582508ad77","Type":"ContainerDied","Data":"61ac8f14b49f3fd6b98272becd05c65f3abb07b77446f6e3664418cebc690262"} Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.653973 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" podStartSLOduration=4.653953492 podStartE2EDuration="4.653953492s" podCreationTimestamp="2026-02-17 17:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:21.647282963 +0000 UTC m=+213.022686238" watchObservedRunningTime="2026-02-17 17:47:21.653953492 +0000 UTC m=+213.029356767" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.685250 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" podStartSLOduration=4.685234301 podStartE2EDuration="4.685234301s" podCreationTimestamp="2026-02-17 17:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:21.684226535 +0000 UTC m=+213.059629800" watchObservedRunningTime="2026-02-17 17:47:21.685234301 +0000 UTC m=+213.060637566" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.709120 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.816906 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.911917 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q424n" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="registry-server" probeResult="failure" output=< Feb 17 17:47:21 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:47:21 crc kubenswrapper[4892]: > Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.941482 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-utilities\") pod \"cda40965-c69d-469d-beb4-91582508ad77\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.941541 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-catalog-content\") pod \"cda40965-c69d-469d-beb4-91582508ad77\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.941622 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8tfn\" (UniqueName: \"kubernetes.io/projected/cda40965-c69d-469d-beb4-91582508ad77-kube-api-access-m8tfn\") pod \"cda40965-c69d-469d-beb4-91582508ad77\" (UID: \"cda40965-c69d-469d-beb4-91582508ad77\") " Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.943315 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-utilities" (OuterVolumeSpecName: "utilities") pod 
"cda40965-c69d-469d-beb4-91582508ad77" (UID: "cda40965-c69d-469d-beb4-91582508ad77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.952983 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda40965-c69d-469d-beb4-91582508ad77-kube-api-access-m8tfn" (OuterVolumeSpecName: "kube-api-access-m8tfn") pod "cda40965-c69d-469d-beb4-91582508ad77" (UID: "cda40965-c69d-469d-beb4-91582508ad77"). InnerVolumeSpecName "kube-api-access-m8tfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:21 crc kubenswrapper[4892]: I0217 17:47:21.998539 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cda40965-c69d-469d-beb4-91582508ad77" (UID: "cda40965-c69d-469d-beb4-91582508ad77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.042797 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.042865 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda40965-c69d-469d-beb4-91582508ad77-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.042880 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8tfn\" (UniqueName: \"kubernetes.io/projected/cda40965-c69d-469d-beb4-91582508ad77-kube-api-access-m8tfn\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.644048 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjqrw" event={"ID":"8498dfc3-1aa0-4059-abad-cab139ba83ec","Type":"ContainerStarted","Data":"6149f27584b0888a515761df07de36ffe15ad669fc82075ec03e130d972b3e4c"} Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.646860 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n77k" event={"ID":"0c8cf662-ff88-481f-ae9d-de0fe49af1f4","Type":"ContainerStarted","Data":"10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a"} Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.649390 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzbvh" event={"ID":"cda40965-c69d-469d-beb4-91582508ad77","Type":"ContainerDied","Data":"41e16e977ddf1804ea1891f5b9f306921e481f4ea0ef64193203838d6fe7f04c"} Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.649436 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzbvh" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.649450 4892 scope.go:117] "RemoveContainer" containerID="61ac8f14b49f3fd6b98272becd05c65f3abb07b77446f6e3664418cebc690262" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.658540 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.683742 4892 scope.go:117] "RemoveContainer" containerID="ed58704cf7bab7d976c821b136d4779f41853fa8d123b8a98a29c62ed2054f9d" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.704017 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9n77k" podStartSLOduration=3.947670977 podStartE2EDuration="1m5.703999642s" podCreationTimestamp="2026-02-17 17:46:17 +0000 UTC" firstStartedPulling="2026-02-17 17:46:19.975935604 +0000 UTC m=+151.351338869" lastFinishedPulling="2026-02-17 17:47:21.732264269 +0000 UTC m=+213.107667534" observedRunningTime="2026-02-17 17:47:22.70192993 +0000 UTC m=+214.077333205" watchObservedRunningTime="2026-02-17 17:47:22.703999642 +0000 UTC m=+214.079402907" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.705378 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjqrw" podStartSLOduration=3.107243994 podStartE2EDuration="1m3.705371717s" podCreationTimestamp="2026-02-17 17:46:19 +0000 UTC" firstStartedPulling="2026-02-17 17:46:21.058351262 +0000 UTC m=+152.433754527" lastFinishedPulling="2026-02-17 17:47:21.656478975 +0000 UTC m=+213.031882250" observedRunningTime="2026-02-17 17:47:22.678506888 +0000 UTC m=+214.053910153" watchObservedRunningTime="2026-02-17 17:47:22.705371717 +0000 UTC m=+214.080774982" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.706535 4892 scope.go:117] 
"RemoveContainer" containerID="95e2423cb23413ef2cd8d48d47391cfcdb331c4ca8874388e26ba6148dd5ca0d" Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.721681 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzbvh"] Feb 17 17:47:22 crc kubenswrapper[4892]: I0217 17:47:22.725374 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzbvh"] Feb 17 17:47:23 crc kubenswrapper[4892]: I0217 17:47:23.365235 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda40965-c69d-469d-beb4-91582508ad77" path="/var/lib/kubelet/pods/cda40965-c69d-469d-beb4-91582508ad77/volumes" Feb 17 17:47:23 crc kubenswrapper[4892]: I0217 17:47:23.588188 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7rxl"] Feb 17 17:47:23 crc kubenswrapper[4892]: I0217 17:47:23.658287 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f7rxl" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="registry-server" containerID="cri-o://d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33" gracePeriod=2 Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.063514 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.172374 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-catalog-content\") pod \"6a4ab50e-1f68-4755-a980-157e38a83436\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.172464 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-utilities\") pod \"6a4ab50e-1f68-4755-a980-157e38a83436\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.172504 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj6nr\" (UniqueName: \"kubernetes.io/projected/6a4ab50e-1f68-4755-a980-157e38a83436-kube-api-access-jj6nr\") pod \"6a4ab50e-1f68-4755-a980-157e38a83436\" (UID: \"6a4ab50e-1f68-4755-a980-157e38a83436\") " Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.173507 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-utilities" (OuterVolumeSpecName: "utilities") pod "6a4ab50e-1f68-4755-a980-157e38a83436" (UID: "6a4ab50e-1f68-4755-a980-157e38a83436"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.196767 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4ab50e-1f68-4755-a980-157e38a83436-kube-api-access-jj6nr" (OuterVolumeSpecName: "kube-api-access-jj6nr") pod "6a4ab50e-1f68-4755-a980-157e38a83436" (UID: "6a4ab50e-1f68-4755-a980-157e38a83436"). InnerVolumeSpecName "kube-api-access-jj6nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.274188 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.274221 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj6nr\" (UniqueName: \"kubernetes.io/projected/6a4ab50e-1f68-4755-a980-157e38a83436-kube-api-access-jj6nr\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.311955 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a4ab50e-1f68-4755-a980-157e38a83436" (UID: "6a4ab50e-1f68-4755-a980-157e38a83436"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.375060 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a4ab50e-1f68-4755-a980-157e38a83436-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.665735 4892 generic.go:334] "Generic (PLEG): container finished" podID="6a4ab50e-1f68-4755-a980-157e38a83436" containerID="d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33" exitCode=0 Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.665784 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerDied","Data":"d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33"} Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.665834 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-f7rxl" event={"ID":"6a4ab50e-1f68-4755-a980-157e38a83436","Type":"ContainerDied","Data":"614787ceadd9869372533cd81172060e31e3e3ecb3dab6b1a24b0a60d41213ed"} Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.665854 4892 scope.go:117] "RemoveContainer" containerID="d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.665910 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7rxl" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.691882 4892 scope.go:117] "RemoveContainer" containerID="7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.712586 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7rxl"] Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.718069 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f7rxl"] Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.740030 4892 scope.go:117] "RemoveContainer" containerID="d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.760701 4892 scope.go:117] "RemoveContainer" containerID="d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33" Feb 17 17:47:24 crc kubenswrapper[4892]: E0217 17:47:24.761260 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33\": container with ID starting with d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33 not found: ID does not exist" containerID="d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.761304 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33"} err="failed to get container status \"d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33\": rpc error: code = NotFound desc = could not find container \"d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33\": container with ID starting with d55d79758b5fad4d05d08b91b02aaa1574dbbfee5558aa84f8ee81dc50702c33 not found: ID does not exist" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.761360 4892 scope.go:117] "RemoveContainer" containerID="7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831" Feb 17 17:47:24 crc kubenswrapper[4892]: E0217 17:47:24.761745 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831\": container with ID starting with 7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831 not found: ID does not exist" containerID="7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.761807 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831"} err="failed to get container status \"7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831\": rpc error: code = NotFound desc = could not find container \"7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831\": container with ID starting with 7945163f82e6d87d962f9c1193733e3866406a7fc6668123ee558f81e06a1831 not found: ID does not exist" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.761978 4892 scope.go:117] "RemoveContainer" containerID="d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd" Feb 17 17:47:24 crc kubenswrapper[4892]: E0217 
17:47:24.762655 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd\": container with ID starting with d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd not found: ID does not exist" containerID="d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd" Feb 17 17:47:24 crc kubenswrapper[4892]: I0217 17:47:24.762704 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd"} err="failed to get container status \"d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd\": rpc error: code = NotFound desc = could not find container \"d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd\": container with ID starting with d260debdcd79b86e2ca286cd52155c0dc90420adeb6ae2907aab335e14b1a1bd not found: ID does not exist" Feb 17 17:47:25 crc kubenswrapper[4892]: I0217 17:47:25.366188 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" path="/var/lib/kubelet/pods/6a4ab50e-1f68-4755-a980-157e38a83436/volumes" Feb 17 17:47:28 crc kubenswrapper[4892]: I0217 17:47:28.129154 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:47:28 crc kubenswrapper[4892]: I0217 17:47:28.129210 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:47:28 crc kubenswrapper[4892]: I0217 17:47:28.178868 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:47:28 crc kubenswrapper[4892]: I0217 17:47:28.743559 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:47:28 crc kubenswrapper[4892]: I0217 17:47:28.997604 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n77k"] Feb 17 17:47:29 crc kubenswrapper[4892]: I0217 17:47:29.461878 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:47:29 crc kubenswrapper[4892]: I0217 17:47:29.462142 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:47:29 crc kubenswrapper[4892]: I0217 17:47:29.517778 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:47:29 crc kubenswrapper[4892]: I0217 17:47:29.756540 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:47:29 crc kubenswrapper[4892]: I0217 17:47:29.924437 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:47:30 crc kubenswrapper[4892]: I0217 17:47:30.699425 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9n77k" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="registry-server" containerID="cri-o://10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a" gracePeriod=2 Feb 17 17:47:30 crc kubenswrapper[4892]: I0217 17:47:30.903424 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:47:30 crc kubenswrapper[4892]: I0217 17:47:30.951395 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.243058 4892 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.378033 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-catalog-content\") pod \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.378374 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bppw6\" (UniqueName: \"kubernetes.io/projected/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-kube-api-access-bppw6\") pod \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.378413 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-utilities\") pod \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\" (UID: \"0c8cf662-ff88-481f-ae9d-de0fe49af1f4\") " Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.380393 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-utilities" (OuterVolumeSpecName: "utilities") pod "0c8cf662-ff88-481f-ae9d-de0fe49af1f4" (UID: "0c8cf662-ff88-481f-ae9d-de0fe49af1f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.394373 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-kube-api-access-bppw6" (OuterVolumeSpecName: "kube-api-access-bppw6") pod "0c8cf662-ff88-481f-ae9d-de0fe49af1f4" (UID: "0c8cf662-ff88-481f-ae9d-de0fe49af1f4"). 
InnerVolumeSpecName "kube-api-access-bppw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.430499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c8cf662-ff88-481f-ae9d-de0fe49af1f4" (UID: "0c8cf662-ff88-481f-ae9d-de0fe49af1f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.480271 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bppw6\" (UniqueName: \"kubernetes.io/projected/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-kube-api-access-bppw6\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.480409 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.480425 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8cf662-ff88-481f-ae9d-de0fe49af1f4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.710643 4892 generic.go:334] "Generic (PLEG): container finished" podID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerID="10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a" exitCode=0 Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.710885 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n77k" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.710953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n77k" event={"ID":"0c8cf662-ff88-481f-ae9d-de0fe49af1f4","Type":"ContainerDied","Data":"10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a"} Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.710999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n77k" event={"ID":"0c8cf662-ff88-481f-ae9d-de0fe49af1f4","Type":"ContainerDied","Data":"10890d3fc0ec04ba91cccf530482cad3cd1d64f2284110d109b6cb4c2f9a7ab1"} Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.711029 4892 scope.go:117] "RemoveContainer" containerID="10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.754417 4892 scope.go:117] "RemoveContainer" containerID="b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.763872 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n77k"] Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.767106 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9n77k"] Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.789602 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2t7d"] Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.789883 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q2t7d" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="registry-server" containerID="cri-o://5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5" gracePeriod=2 Feb 17 17:47:31 crc 
kubenswrapper[4892]: I0217 17:47:31.799026 4892 scope.go:117] "RemoveContainer" containerID="a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.819347 4892 scope.go:117] "RemoveContainer" containerID="10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a" Feb 17 17:47:31 crc kubenswrapper[4892]: E0217 17:47:31.819833 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a\": container with ID starting with 10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a not found: ID does not exist" containerID="10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.819870 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a"} err="failed to get container status \"10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a\": rpc error: code = NotFound desc = could not find container \"10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a\": container with ID starting with 10503cdba15dbb02d56c0bd4d0d5c1c10a1dfab526969776027736f50fbdf59a not found: ID does not exist" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.819896 4892 scope.go:117] "RemoveContainer" containerID="b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b" Feb 17 17:47:31 crc kubenswrapper[4892]: E0217 17:47:31.821151 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b\": container with ID starting with b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b not found: ID does not exist" 
containerID="b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.821184 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b"} err="failed to get container status \"b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b\": rpc error: code = NotFound desc = could not find container \"b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b\": container with ID starting with b5747251961b6f3d14434c38dff9ca2c25b6c7dd7206f9d921d67ea49e80294b not found: ID does not exist" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.821204 4892 scope.go:117] "RemoveContainer" containerID="a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9" Feb 17 17:47:31 crc kubenswrapper[4892]: E0217 17:47:31.821519 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9\": container with ID starting with a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9 not found: ID does not exist" containerID="a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9" Feb 17 17:47:31 crc kubenswrapper[4892]: I0217 17:47:31.821560 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9"} err="failed to get container status \"a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9\": rpc error: code = NotFound desc = could not find container \"a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9\": container with ID starting with a3b1a01c913a494eb9b7cacdd13d127d0a4312d1709b9e07e372b0f7922d9cf9 not found: ID does not exist" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.271267 4892 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.400447 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-utilities\") pod \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.400532 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9j7z\" (UniqueName: \"kubernetes.io/projected/0a738868-4aa5-4aa8-8058-863b3dfb3acb-kube-api-access-z9j7z\") pod \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.400553 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-catalog-content\") pod \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\" (UID: \"0a738868-4aa5-4aa8-8058-863b3dfb3acb\") " Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.401435 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-utilities" (OuterVolumeSpecName: "utilities") pod "0a738868-4aa5-4aa8-8058-863b3dfb3acb" (UID: "0a738868-4aa5-4aa8-8058-863b3dfb3acb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.407993 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a738868-4aa5-4aa8-8058-863b3dfb3acb-kube-api-access-z9j7z" (OuterVolumeSpecName: "kube-api-access-z9j7z") pod "0a738868-4aa5-4aa8-8058-863b3dfb3acb" (UID: "0a738868-4aa5-4aa8-8058-863b3dfb3acb"). 
InnerVolumeSpecName "kube-api-access-z9j7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.431160 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a738868-4aa5-4aa8-8058-863b3dfb3acb" (UID: "0a738868-4aa5-4aa8-8058-863b3dfb3acb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.501851 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9j7z\" (UniqueName: \"kubernetes.io/projected/0a738868-4aa5-4aa8-8058-863b3dfb3acb-kube-api-access-z9j7z\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.501888 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.501899 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a738868-4aa5-4aa8-8058-863b3dfb3acb-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.736915 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerID="5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5" exitCode=0 Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.736981 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2t7d" event={"ID":"0a738868-4aa5-4aa8-8058-863b3dfb3acb","Type":"ContainerDied","Data":"5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5"} Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.737019 4892 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2t7d" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.737055 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2t7d" event={"ID":"0a738868-4aa5-4aa8-8058-863b3dfb3acb","Type":"ContainerDied","Data":"e70d8ae6109d36f340662d54b77f01f06be0d761e67f0686475aeeeeb7b895e6"} Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.737086 4892 scope.go:117] "RemoveContainer" containerID="5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.758287 4892 scope.go:117] "RemoveContainer" containerID="149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.776556 4892 scope.go:117] "RemoveContainer" containerID="2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.787017 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2t7d"] Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.791557 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2t7d"] Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.808331 4892 scope.go:117] "RemoveContainer" containerID="5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5" Feb 17 17:47:32 crc kubenswrapper[4892]: E0217 17:47:32.809312 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5\": container with ID starting with 5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5 not found: ID does not exist" containerID="5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5" Feb 17 17:47:32 crc 
kubenswrapper[4892]: I0217 17:47:32.809354 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5"} err="failed to get container status \"5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5\": rpc error: code = NotFound desc = could not find container \"5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5\": container with ID starting with 5b563a62b45400c6c77bc999fce238da35b608acb03170a509e52a7e7c821fe5 not found: ID does not exist" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.809384 4892 scope.go:117] "RemoveContainer" containerID="149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3" Feb 17 17:47:32 crc kubenswrapper[4892]: E0217 17:47:32.810152 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3\": container with ID starting with 149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3 not found: ID does not exist" containerID="149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.810201 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3"} err="failed to get container status \"149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3\": rpc error: code = NotFound desc = could not find container \"149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3\": container with ID starting with 149aaf51660d7b5620d873cf039b05a8910daef37d8841f4e2b1f486c6b85ae3 not found: ID does not exist" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.810240 4892 scope.go:117] "RemoveContainer" containerID="2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068" Feb 17 
17:47:32 crc kubenswrapper[4892]: E0217 17:47:32.810846 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068\": container with ID starting with 2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068 not found: ID does not exist" containerID="2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068" Feb 17 17:47:32 crc kubenswrapper[4892]: I0217 17:47:32.810882 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068"} err="failed to get container status \"2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068\": rpc error: code = NotFound desc = could not find container \"2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068\": container with ID starting with 2a32002b8d0a52892cd5a6f93d9cb44226d2b2312a6423691fbc8a059c27d068 not found: ID does not exist" Feb 17 17:47:33 crc kubenswrapper[4892]: I0217 17:47:33.369220 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" path="/var/lib/kubelet/pods/0a738868-4aa5-4aa8-8058-863b3dfb3acb/volumes" Feb 17 17:47:33 crc kubenswrapper[4892]: I0217 17:47:33.369899 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" path="/var/lib/kubelet/pods/0c8cf662-ff88-481f-ae9d-de0fe49af1f4/volumes" Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.316750 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6947b757c5-8hwvg"] Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.320281 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" podUID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" 
containerName="controller-manager" containerID="cri-o://5ccae063352eef56f9ae75430e2bfcf2f1f044a128eb9dee1a25dcc4388a0efc" gracePeriod=30 Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.413848 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6"] Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.414080 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" podUID="d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" containerName="route-controller-manager" containerID="cri-o://c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162" gracePeriod=30 Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.424735 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.424795 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.424863 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.425456 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52"} 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.425520 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52" gracePeriod=600 Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.764388 4892 generic.go:334] "Generic (PLEG): container finished" podID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" containerID="5ccae063352eef56f9ae75430e2bfcf2f1f044a128eb9dee1a25dcc4388a0efc" exitCode=0 Feb 17 17:47:37 crc kubenswrapper[4892]: I0217 17:47:37.764430 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" event={"ID":"e20b74ec-bb8b-4674-b0f6-31d8f02ff100","Type":"ContainerDied","Data":"5ccae063352eef56f9ae75430e2bfcf2f1f044a128eb9dee1a25dcc4388a0efc"} Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.707511 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.733737 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7"] Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734017 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734039 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734054 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" containerName="route-controller-manager" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734061 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" containerName="route-controller-manager" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734072 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734078 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734133 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734142 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734156 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734164 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734174 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734181 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734192 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734199 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734210 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734217 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734227 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734235 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734244 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734251 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734261 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734269 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734279 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734286 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="extract-content" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.734296 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734303 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="extract-utilities" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734423 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda40965-c69d-469d-beb4-91582508ad77" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734475 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8cf662-ff88-481f-ae9d-de0fe49af1f4" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734488 
4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4ab50e-1f68-4755-a980-157e38a83436" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734495 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" containerName="route-controller-manager" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734507 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a738868-4aa5-4aa8-8058-863b3dfb3acb" containerName="registry-server" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.734884 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.741566 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7"] Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.771615 4892 generic.go:334] "Generic (PLEG): container finished" podID="d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" containerID="c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162" exitCode=0 Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.771719 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" event={"ID":"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765","Type":"ContainerDied","Data":"c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162"} Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.771698 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.771794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6" event={"ID":"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765","Type":"ContainerDied","Data":"ce8a7b347bc0ba11c3ee56b8460317edb35c5877d25f4f642b85d9c34093327d"} Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.771837 4892 scope.go:117] "RemoveContainer" containerID="c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.779759 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52" exitCode=0 Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.779803 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52"} Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.779852 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"1f481e119dd06565c296707fbdfe27f9e2c04abd2d62b9e12028ab806fca7152"} Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.799286 4892 scope.go:117] "RemoveContainer" containerID="c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162" Feb 17 17:47:38 crc kubenswrapper[4892]: E0217 17:47:38.800267 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162\": container with ID starting with c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162 not found: ID does not exist" containerID="c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.800327 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162"} err="failed to get container status \"c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162\": rpc error: code = NotFound desc = could not find container \"c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162\": container with ID starting with c373bc6ed06ef408037fd87b1d95403106f3849f2ff5dbd09d584d5dbc344162 not found: ID does not exist" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.802600 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1b1449-9911-496a-bb90-b207edd42528-config\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.802768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a1b1449-9911-496a-bb90-b207edd42528-serving-cert\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.802787 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m99hs\" (UniqueName: 
\"kubernetes.io/projected/6a1b1449-9911-496a-bb90-b207edd42528-kube-api-access-m99hs\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.802854 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a1b1449-9911-496a-bb90-b207edd42528-client-ca\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.857386 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903529 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntm54\" (UniqueName: \"kubernetes.io/projected/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-kube-api-access-ntm54\") pod \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903583 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-client-ca\") pod \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903601 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4xzb\" (UniqueName: \"kubernetes.io/projected/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-kube-api-access-c4xzb\") pod \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\" (UID: 
\"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903623 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-config\") pod \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903657 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-config\") pod \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903682 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-client-ca\") pod \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903698 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-serving-cert\") pod \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\" (UID: \"d56da1fc-e0fa-4d86-89d8-b46b1c4f5765\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903713 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-serving-cert\") pod \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903732 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-proxy-ca-bundles\") pod \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\" (UID: \"e20b74ec-bb8b-4674-b0f6-31d8f02ff100\") " Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.903954 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1b1449-9911-496a-bb90-b207edd42528-config\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.904030 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m99hs\" (UniqueName: \"kubernetes.io/projected/6a1b1449-9911-496a-bb90-b207edd42528-kube-api-access-m99hs\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.904051 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a1b1449-9911-496a-bb90-b207edd42528-serving-cert\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.904080 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a1b1449-9911-496a-bb90-b207edd42528-client-ca\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.904415 4892 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-client-ca" (OuterVolumeSpecName: "client-ca") pod "e20b74ec-bb8b-4674-b0f6-31d8f02ff100" (UID: "e20b74ec-bb8b-4674-b0f6-31d8f02ff100"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.904672 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a1b1449-9911-496a-bb90-b207edd42528-client-ca\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.904940 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1b1449-9911-496a-bb90-b207edd42528-config\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.905214 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-client-ca" (OuterVolumeSpecName: "client-ca") pod "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" (UID: "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.905630 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-config" (OuterVolumeSpecName: "config") pod "e20b74ec-bb8b-4674-b0f6-31d8f02ff100" (UID: "e20b74ec-bb8b-4674-b0f6-31d8f02ff100"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.905673 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-config" (OuterVolumeSpecName: "config") pod "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" (UID: "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.908549 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e20b74ec-bb8b-4674-b0f6-31d8f02ff100" (UID: "e20b74ec-bb8b-4674-b0f6-31d8f02ff100"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.908985 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e20b74ec-bb8b-4674-b0f6-31d8f02ff100" (UID: "e20b74ec-bb8b-4674-b0f6-31d8f02ff100"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.909068 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-kube-api-access-ntm54" (OuterVolumeSpecName: "kube-api-access-ntm54") pod "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" (UID: "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765"). InnerVolumeSpecName "kube-api-access-ntm54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.909414 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a1b1449-9911-496a-bb90-b207edd42528-serving-cert\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.909424 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" (UID: "d56da1fc-e0fa-4d86-89d8-b46b1c4f5765"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.914983 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-kube-api-access-c4xzb" (OuterVolumeSpecName: "kube-api-access-c4xzb") pod "e20b74ec-bb8b-4674-b0f6-31d8f02ff100" (UID: "e20b74ec-bb8b-4674-b0f6-31d8f02ff100"). InnerVolumeSpecName "kube-api-access-c4xzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:38 crc kubenswrapper[4892]: I0217 17:47:38.918603 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m99hs\" (UniqueName: \"kubernetes.io/projected/6a1b1449-9911-496a-bb90-b207edd42528-kube-api-access-m99hs\") pod \"route-controller-manager-58cdcc4d95-wq4w7\" (UID: \"6a1b1449-9911-496a-bb90-b207edd42528\") " pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004566 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004600 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004614 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004624 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004632 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004642 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntm54\" (UniqueName: 
\"kubernetes.io/projected/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765-kube-api-access-ntm54\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004652 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004662 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4xzb\" (UniqueName: \"kubernetes.io/projected/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-kube-api-access-c4xzb\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.004672 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20b74ec-bb8b-4674-b0f6-31d8f02ff100-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.055955 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.113089 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6"] Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.116900 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c88bcf5b9-9bqj6"] Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.367399 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56da1fc-e0fa-4d86-89d8-b46b1c4f5765" path="/var/lib/kubelet/pods/d56da1fc-e0fa-4d86-89d8-b46b1c4f5765/volumes" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.520058 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7"] Feb 17 17:47:39 crc kubenswrapper[4892]: W0217 17:47:39.525728 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a1b1449_9911_496a_bb90_b207edd42528.slice/crio-f181da76c315599b4d561b4be1695a5b1a302ae5a88893fcd23b7d4d69dbcb09 WatchSource:0}: Error finding container f181da76c315599b4d561b4be1695a5b1a302ae5a88893fcd23b7d4d69dbcb09: Status 404 returned error can't find the container with id f181da76c315599b4d561b4be1695a5b1a302ae5a88893fcd23b7d4d69dbcb09 Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.791083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" event={"ID":"6a1b1449-9911-496a-bb90-b207edd42528","Type":"ContainerStarted","Data":"3ad7ab4dd3e1cb9b62225c2eb4c85fd6bd97ceee62964a71bd3e70675334384a"} Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.792025 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" event={"ID":"6a1b1449-9911-496a-bb90-b207edd42528","Type":"ContainerStarted","Data":"f181da76c315599b4d561b4be1695a5b1a302ae5a88893fcd23b7d4d69dbcb09"} Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.795220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" event={"ID":"e20b74ec-bb8b-4674-b0f6-31d8f02ff100","Type":"ContainerDied","Data":"be87bfdce5887b47abcbe6369a7c1ee1d100bec2dc5cf37780579a83a205ef4b"} Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.795269 4892 scope.go:117] "RemoveContainer" containerID="5ccae063352eef56f9ae75430e2bfcf2f1f044a128eb9dee1a25dcc4388a0efc" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.795418 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6947b757c5-8hwvg" Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.809790 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6947b757c5-8hwvg"] Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.812270 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6947b757c5-8hwvg"] Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.979784 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vh5f8"] Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.980164 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vh5f8" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="registry-server" containerID="cri-o://942cc645d18d4a896708d9d97cdfe4687015e7e43dea707b1f4f3fa654b94f5b" gracePeriod=30 Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.995307 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-2dtr9"] Feb 17 17:47:39 crc kubenswrapper[4892]: I0217 17:47:39.995587 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2dtr9" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="registry-server" containerID="cri-o://eb82e87c02e4da31e4beea83f2bb51bb4112d961eedd71995389ff5801902e41" gracePeriod=30 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:39.999669 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xlhd5"] Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.001199 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerName="marketplace-operator" containerID="cri-o://5a9d6f29dd69f0c88f04d654389d0c1fc3b2e8e6144ebe204be4e4bfe3f2b38f" gracePeriod=30 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.003181 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjqrw"] Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.003352 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rjqrw" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="registry-server" containerID="cri-o://6149f27584b0888a515761df07de36ffe15ad669fc82075ec03e130d972b3e4c" gracePeriod=30 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.019616 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q424n"] Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.020012 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q424n" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="registry-server" 
containerID="cri-o://72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18" gracePeriod=30 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.026437 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5t5vc"] Feb 17 17:47:40 crc kubenswrapper[4892]: E0217 17:47:40.026700 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" containerName="controller-manager" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.026725 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" containerName="controller-manager" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.026906 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" containerName="controller-manager" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.027341 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.032451 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5t5vc"] Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.217251 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9djl\" (UniqueName: \"kubernetes.io/projected/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-kube-api-access-t9djl\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.217694 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.217761 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.252497 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n8n76"] Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.318922 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.318994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9djl\" (UniqueName: \"kubernetes.io/projected/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-kube-api-access-t9djl\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.319019 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.320021 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.326445 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 
17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.334197 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9djl\" (UniqueName: \"kubernetes.io/projected/c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb-kube-api-access-t9djl\") pod \"marketplace-operator-79b997595-5t5vc\" (UID: \"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.354030 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.668853 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5t5vc"] Feb 17 17:47:40 crc kubenswrapper[4892]: W0217 17:47:40.681736 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d1cad4_ce5d_4116_a7ad_7a2f6c51c2bb.slice/crio-7de3533991c839cb7881385169a9a2cec84efddc491a920efec786e2317b51c0 WatchSource:0}: Error finding container 7de3533991c839cb7881385169a9a2cec84efddc491a920efec786e2317b51c0: Status 404 returned error can't find the container with id 7de3533991c839cb7881385169a9a2cec84efddc491a920efec786e2317b51c0 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.836085 4892 generic.go:334] "Generic (PLEG): container finished" podID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerID="72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18" exitCode=0 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.836175 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerDied","Data":"72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18"} Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.853751 4892 
generic.go:334] "Generic (PLEG): container finished" podID="cbb9e745-9259-437f-a600-80153e687c65" containerID="942cc645d18d4a896708d9d97cdfe4687015e7e43dea707b1f4f3fa654b94f5b" exitCode=0 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.853820 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh5f8" event={"ID":"cbb9e745-9259-437f-a600-80153e687c65","Type":"ContainerDied","Data":"942cc645d18d4a896708d9d97cdfe4687015e7e43dea707b1f4f3fa654b94f5b"} Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.855119 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" event={"ID":"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb","Type":"ContainerStarted","Data":"7de3533991c839cb7881385169a9a2cec84efddc491a920efec786e2317b51c0"} Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.857692 4892 generic.go:334] "Generic (PLEG): container finished" podID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerID="eb82e87c02e4da31e4beea83f2bb51bb4112d961eedd71995389ff5801902e41" exitCode=0 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.857754 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dtr9" event={"ID":"9c4922f3-1e12-469d-9afc-c2c52238e551","Type":"ContainerDied","Data":"eb82e87c02e4da31e4beea83f2bb51bb4112d961eedd71995389ff5801902e41"} Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.859037 4892 generic.go:334] "Generic (PLEG): container finished" podID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerID="5a9d6f29dd69f0c88f04d654389d0c1fc3b2e8e6144ebe204be4e4bfe3f2b38f" exitCode=0 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.859105 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" 
event={"ID":"fc4dd9a2-87aa-4405-a4cc-778c778aaec9","Type":"ContainerDied","Data":"5a9d6f29dd69f0c88f04d654389d0c1fc3b2e8e6144ebe204be4e4bfe3f2b38f"} Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.861720 4892 generic.go:334] "Generic (PLEG): container finished" podID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerID="6149f27584b0888a515761df07de36ffe15ad669fc82075ec03e130d972b3e4c" exitCode=0 Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.862527 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjqrw" event={"ID":"8498dfc3-1aa0-4059-abad-cab139ba83ec","Type":"ContainerDied","Data":"6149f27584b0888a515761df07de36ffe15ad669fc82075ec03e130d972b3e4c"} Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.862750 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:40 crc kubenswrapper[4892]: E0217 17:47:40.870694 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18 is running failed: container process not found" containerID="72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 17:47:40 crc kubenswrapper[4892]: E0217 17:47:40.871397 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18 is running failed: container process not found" containerID="72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.872666 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" Feb 17 17:47:40 crc kubenswrapper[4892]: E0217 17:47:40.872672 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18 is running failed: container process not found" containerID="72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 17:47:40 crc kubenswrapper[4892]: E0217 17:47:40.872710 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-q424n" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="registry-server" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.894483 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" podStartSLOduration=3.894463363 podStartE2EDuration="3.894463363s" podCreationTimestamp="2026-02-17 17:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:40.894053593 +0000 UTC m=+232.269456858" watchObservedRunningTime="2026-02-17 17:47:40.894463363 +0000 UTC m=+232.269866638" Feb 17 17:47:40 crc kubenswrapper[4892]: I0217 17:47:40.953302 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.079033 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-catalog-content\") pod \"cbb9e745-9259-437f-a600-80153e687c65\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.079082 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-utilities\") pod \"cbb9e745-9259-437f-a600-80153e687c65\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.079160 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8vd6\" (UniqueName: \"kubernetes.io/projected/cbb9e745-9259-437f-a600-80153e687c65-kube-api-access-k8vd6\") pod \"cbb9e745-9259-437f-a600-80153e687c65\" (UID: \"cbb9e745-9259-437f-a600-80153e687c65\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.081393 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-utilities" (OuterVolumeSpecName: "utilities") pod "cbb9e745-9259-437f-a600-80153e687c65" (UID: "cbb9e745-9259-437f-a600-80153e687c65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.087925 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb9e745-9259-437f-a600-80153e687c65-kube-api-access-k8vd6" (OuterVolumeSpecName: "kube-api-access-k8vd6") pod "cbb9e745-9259-437f-a600-80153e687c65" (UID: "cbb9e745-9259-437f-a600-80153e687c65"). InnerVolumeSpecName "kube-api-access-k8vd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.130855 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.151278 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbb9e745-9259-437f-a600-80153e687c65" (UID: "cbb9e745-9259-437f-a600-80153e687c65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.179932 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-operator-metrics\") pod \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.180006 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5n65\" (UniqueName: \"kubernetes.io/projected/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-kube-api-access-m5n65\") pod \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.180081 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-trusted-ca\") pod \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\" (UID: \"fc4dd9a2-87aa-4405-a4cc-778c778aaec9\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.180276 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.180293 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb9e745-9259-437f-a600-80153e687c65-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.180301 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8vd6\" (UniqueName: \"kubernetes.io/projected/cbb9e745-9259-437f-a600-80153e687c65-kube-api-access-k8vd6\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.180866 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "fc4dd9a2-87aa-4405-a4cc-778c778aaec9" (UID: "fc4dd9a2-87aa-4405-a4cc-778c778aaec9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.183288 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "fc4dd9a2-87aa-4405-a4cc-778c778aaec9" (UID: "fc4dd9a2-87aa-4405-a4cc-778c778aaec9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.183647 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-kube-api-access-m5n65" (OuterVolumeSpecName: "kube-api-access-m5n65") pod "fc4dd9a2-87aa-4405-a4cc-778c778aaec9" (UID: "fc4dd9a2-87aa-4405-a4cc-778c778aaec9"). 
InnerVolumeSpecName "kube-api-access-m5n65". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.199389 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.204435 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.209535 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281463 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-catalog-content\") pod \"9c4922f3-1e12-469d-9afc-c2c52238e551\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281552 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-utilities\") pod \"9c4922f3-1e12-469d-9afc-c2c52238e551\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281586 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqqtf\" (UniqueName: \"kubernetes.io/projected/8498dfc3-1aa0-4059-abad-cab139ba83ec-kube-api-access-lqqtf\") pod \"8498dfc3-1aa0-4059-abad-cab139ba83ec\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281642 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fbkk\" (UniqueName: 
\"kubernetes.io/projected/9c4922f3-1e12-469d-9afc-c2c52238e551-kube-api-access-8fbkk\") pod \"9c4922f3-1e12-469d-9afc-c2c52238e551\" (UID: \"9c4922f3-1e12-469d-9afc-c2c52238e551\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281664 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-catalog-content\") pod \"8498dfc3-1aa0-4059-abad-cab139ba83ec\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281709 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2t9n\" (UniqueName: \"kubernetes.io/projected/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-kube-api-access-t2t9n\") pod \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281739 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-utilities\") pod \"8498dfc3-1aa0-4059-abad-cab139ba83ec\" (UID: \"8498dfc3-1aa0-4059-abad-cab139ba83ec\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281760 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-catalog-content\") pod \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.281832 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-utilities\") pod \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\" (UID: \"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1\") " Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 
17:47:41.282055 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.282071 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5n65\" (UniqueName: \"kubernetes.io/projected/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-kube-api-access-m5n65\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.282084 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc4dd9a2-87aa-4405-a4cc-778c778aaec9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.282713 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-utilities" (OuterVolumeSpecName: "utilities") pod "96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" (UID: "96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.282970 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-utilities" (OuterVolumeSpecName: "utilities") pod "9c4922f3-1e12-469d-9afc-c2c52238e551" (UID: "9c4922f3-1e12-469d-9afc-c2c52238e551"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.283045 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-utilities" (OuterVolumeSpecName: "utilities") pod "8498dfc3-1aa0-4059-abad-cab139ba83ec" (UID: "8498dfc3-1aa0-4059-abad-cab139ba83ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.285785 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4922f3-1e12-469d-9afc-c2c52238e551-kube-api-access-8fbkk" (OuterVolumeSpecName: "kube-api-access-8fbkk") pod "9c4922f3-1e12-469d-9afc-c2c52238e551" (UID: "9c4922f3-1e12-469d-9afc-c2c52238e551"). InnerVolumeSpecName "kube-api-access-8fbkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.285925 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-kube-api-access-t2t9n" (OuterVolumeSpecName: "kube-api-access-t2t9n") pod "96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" (UID: "96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1"). InnerVolumeSpecName "kube-api-access-t2t9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.286559 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8498dfc3-1aa0-4059-abad-cab139ba83ec-kube-api-access-lqqtf" (OuterVolumeSpecName: "kube-api-access-lqqtf") pod "8498dfc3-1aa0-4059-abad-cab139ba83ec" (UID: "8498dfc3-1aa0-4059-abad-cab139ba83ec"). InnerVolumeSpecName "kube-api-access-lqqtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.305686 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8498dfc3-1aa0-4059-abad-cab139ba83ec" (UID: "8498dfc3-1aa0-4059-abad-cab139ba83ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.344054 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c4922f3-1e12-469d-9afc-c2c52238e551" (UID: "9c4922f3-1e12-469d-9afc-c2c52238e551"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.373964 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20b74ec-bb8b-4674-b0f6-31d8f02ff100" path="/var/lib/kubelet/pods/e20b74ec-bb8b-4674-b0f6-31d8f02ff100/volumes" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382836 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2t9n\" (UniqueName: \"kubernetes.io/projected/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-kube-api-access-t2t9n\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382872 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382887 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 
17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382901 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382912 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4922f3-1e12-469d-9afc-c2c52238e551-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382924 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqqtf\" (UniqueName: \"kubernetes.io/projected/8498dfc3-1aa0-4059-abad-cab139ba83ec-kube-api-access-lqqtf\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382936 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fbkk\" (UniqueName: \"kubernetes.io/projected/9c4922f3-1e12-469d-9afc-c2c52238e551-kube-api-access-8fbkk\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.382947 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8498dfc3-1aa0-4059-abad-cab139ba83ec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.414331 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" (UID: "96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.484069 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.488655 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8498dfc3_1aa0_4059_abad_cab139ba83ec.slice/crio-bd524d390e8e71c79a3d7b80c04bf32a57cc9ab1c9849e4de05f419819cb9341\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4922f3_1e12_469d_9afc_c2c52238e551.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb9e745_9259_437f_a600_80153e687c65.slice/crio-268dbcdc6d9560e491f03e837ec0d4f078e92e5283437cb1689f23c81f22ac04\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4dd9a2_87aa_4405_a4cc_778c778aaec9.slice/crio-7a4dd4a3098a25aaa08eab3545192f2342640940c35e0699f2c31176c8a92602\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb9e745_9259_437f_a600_80153e687c65.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8498dfc3_1aa0_4059_abad_cab139ba83ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4922f3_1e12_469d_9afc_c2c52238e551.slice/crio-b8f0ac963a1f02ca7520e0d01648f9fb03a9ba179d2acf8ea629ed075194fd9b\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4dd9a2_87aa_4405_a4cc_778c778aaec9.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.701704 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd"] Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702167 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerName="marketplace-operator" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702179 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerName="marketplace-operator" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702191 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702197 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702208 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702214 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702225 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702231 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 
17:47:41.702238 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702244 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702253 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702261 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702270 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702275 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702285 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702291 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702299 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702305 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 
17:47:41.702314 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702319 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="extract-content" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702327 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702333 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702341 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702347 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: E0217 17:47:41.702354 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702360 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="extract-utilities" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702436 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb9e745-9259-437f-a600-80153e687c65" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702446 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: 
I0217 17:47:41.702453 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" containerName="marketplace-operator" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702462 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702470 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" containerName="registry-server" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.702778 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.707207 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.707278 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.707542 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.707696 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.707886 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.708021 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.713382 4892 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.718428 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.787056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8512dd-2720-49b7-a51c-be2074c3e435-serving-cert\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.787149 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-config\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.787202 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-client-ca\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.787418 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-proxy-ca-bundles\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " 
pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.787486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6kqk\" (UniqueName: \"kubernetes.io/projected/4b8512dd-2720-49b7-a51c-be2074c3e435-kube-api-access-k6kqk\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.869198 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh5f8" event={"ID":"cbb9e745-9259-437f-a600-80153e687c65","Type":"ContainerDied","Data":"268dbcdc6d9560e491f03e837ec0d4f078e92e5283437cb1689f23c81f22ac04"} Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.869245 4892 scope.go:117] "RemoveContainer" containerID="942cc645d18d4a896708d9d97cdfe4687015e7e43dea707b1f4f3fa654b94f5b" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.869312 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vh5f8" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.871158 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" event={"ID":"c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb","Type":"ContainerStarted","Data":"4252ff94e80e60413089c714385b714f974ad275dd06d75c3b4b23338029f272"} Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.871348 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.874149 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dtr9" event={"ID":"9c4922f3-1e12-469d-9afc-c2c52238e551","Type":"ContainerDied","Data":"b8f0ac963a1f02ca7520e0d01648f9fb03a9ba179d2acf8ea629ed075194fd9b"} Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.874201 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dtr9" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.878203 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.878645 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.879609 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xlhd5" event={"ID":"fc4dd9a2-87aa-4405-a4cc-778c778aaec9","Type":"ContainerDied","Data":"7a4dd4a3098a25aaa08eab3545192f2342640940c35e0699f2c31176c8a92602"} Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.883345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjqrw" event={"ID":"8498dfc3-1aa0-4059-abad-cab139ba83ec","Type":"ContainerDied","Data":"bd524d390e8e71c79a3d7b80c04bf32a57cc9ab1c9849e4de05f419819cb9341"} Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.884208 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjqrw" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.889254 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8512dd-2720-49b7-a51c-be2074c3e435-serving-cert\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.889321 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-config\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.889352 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-client-ca\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.889380 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-proxy-ca-bundles\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.889422 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6kqk\" (UniqueName: \"kubernetes.io/projected/4b8512dd-2720-49b7-a51c-be2074c3e435-kube-api-access-k6kqk\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.890383 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q424n" event={"ID":"96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1","Type":"ContainerDied","Data":"ea78c26ba1cc84d62e5174b25361e3f3ad3514ed99cb8d07529ab77159409551"} Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.890572 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q424n" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.893791 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5t5vc" podStartSLOduration=1.893778893 podStartE2EDuration="1.893778893s" podCreationTimestamp="2026-02-17 17:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:41.891344541 +0000 UTC m=+233.266747806" watchObservedRunningTime="2026-02-17 17:47:41.893778893 +0000 UTC m=+233.269182158" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.898947 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-client-ca\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.899572 4892 scope.go:117] "RemoveContainer" containerID="738eaefa7812c1c2f8b363b20ed4c5492c125b041a57e242e9f2ed854604c068" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.900182 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-proxy-ca-bundles\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.906251 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8512dd-2720-49b7-a51c-be2074c3e435-config\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " 
pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.922930 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vh5f8"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.926674 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6kqk\" (UniqueName: \"kubernetes.io/projected/4b8512dd-2720-49b7-a51c-be2074c3e435-kube-api-access-k6kqk\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.927211 4892 scope.go:117] "RemoveContainer" containerID="8ebeceef20e179b1f52af3ee7ec578218e9f83787d6135ccf4597c9c5326e232" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.943443 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8512dd-2720-49b7-a51c-be2074c3e435-serving-cert\") pod \"controller-manager-69b7b79fcc-zbfpd\" (UID: \"4b8512dd-2720-49b7-a51c-be2074c3e435\") " pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.953078 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vh5f8"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.960068 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjqrw"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.964099 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjqrw"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.967749 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xlhd5"] Feb 17 17:47:41 crc kubenswrapper[4892]: 
I0217 17:47:41.970249 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xlhd5"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.976976 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2dtr9"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.985082 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2dtr9"] Feb 17 17:47:41 crc kubenswrapper[4892]: I0217 17:47:41.998595 4892 scope.go:117] "RemoveContainer" containerID="eb82e87c02e4da31e4beea83f2bb51bb4112d961eedd71995389ff5801902e41" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.020709 4892 scope.go:117] "RemoveContainer" containerID="43b933a294321da90e7290922622d50a53a6ade207ac81e142604060032d6811" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.027669 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q424n"] Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.029960 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q424n"] Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.051755 4892 scope.go:117] "RemoveContainer" containerID="5af8f01b467aa9c8a4acc40a1cfdc98f6a250c45f7553a2d4a3df95fbfcf8e79" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.059985 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.063069 4892 scope.go:117] "RemoveContainer" containerID="5a9d6f29dd69f0c88f04d654389d0c1fc3b2e8e6144ebe204be4e4bfe3f2b38f" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.078998 4892 scope.go:117] "RemoveContainer" containerID="6149f27584b0888a515761df07de36ffe15ad669fc82075ec03e130d972b3e4c" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.089348 4892 scope.go:117] "RemoveContainer" containerID="16c7ce864e5766da1e505fa587594e5df9b68fd94e3b9755e98c650cd170ef65" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.103063 4892 scope.go:117] "RemoveContainer" containerID="df1a5f5e0cb626905f67b812abc37e81b58844c9a8912a578f943334dbb74769" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.118548 4892 scope.go:117] "RemoveContainer" containerID="72fcfa4564659b3702e16d67177b9a31bfc4cf9dc06a4ae895cb1f08ea736a18" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.128369 4892 scope.go:117] "RemoveContainer" containerID="d4e5aa159dbea66b9719c1a7698c99b8cbad3d95e0b2fb1f0af96a0053f96143" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.145779 4892 scope.go:117] "RemoveContainer" containerID="4fa1642e7d2dba1351d3ab8f9cc9e029523c0344237a6ae1ce6d314534967ba7" Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.451221 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd"] Feb 17 17:47:42 crc kubenswrapper[4892]: W0217 17:47:42.455915 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b8512dd_2720_49b7_a51c_be2074c3e435.slice/crio-3e18ba79529fc0d2fdbb2a112f1d4e04b88a85716dca743fe5436baf0650e70b WatchSource:0}: Error finding container 3e18ba79529fc0d2fdbb2a112f1d4e04b88a85716dca743fe5436baf0650e70b: Status 404 returned error can't find the 
container with id 3e18ba79529fc0d2fdbb2a112f1d4e04b88a85716dca743fe5436baf0650e70b Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.902253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" event={"ID":"4b8512dd-2720-49b7-a51c-be2074c3e435","Type":"ContainerStarted","Data":"e67fea6585f5ab8e033528a5f2e583d7dd5f216a91f8516a0fe1572f942ba9af"} Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.902653 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" event={"ID":"4b8512dd-2720-49b7-a51c-be2074c3e435","Type":"ContainerStarted","Data":"3e18ba79529fc0d2fdbb2a112f1d4e04b88a85716dca743fe5436baf0650e70b"} Feb 17 17:47:42 crc kubenswrapper[4892]: I0217 17:47:42.918437 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" podStartSLOduration=5.918418772 podStartE2EDuration="5.918418772s" podCreationTimestamp="2026-02-17 17:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:42.915938499 +0000 UTC m=+234.291341774" watchObservedRunningTime="2026-02-17 17:47:42.918418772 +0000 UTC m=+234.293822037" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.199947 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8w24r"] Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.201370 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.202832 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.213927 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w24r"] Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.311240 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-utilities\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.311330 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768gd\" (UniqueName: \"kubernetes.io/projected/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-kube-api-access-768gd\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.311350 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-catalog-content\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.381233 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8498dfc3-1aa0-4059-abad-cab139ba83ec" path="/var/lib/kubelet/pods/8498dfc3-1aa0-4059-abad-cab139ba83ec/volumes" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.381922 
4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1" path="/var/lib/kubelet/pods/96c2cfe7-cd0c-4fc8-8913-2bf0089a21e1/volumes" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.382595 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4922f3-1e12-469d-9afc-c2c52238e551" path="/var/lib/kubelet/pods/9c4922f3-1e12-469d-9afc-c2c52238e551/volumes" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.383714 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb9e745-9259-437f-a600-80153e687c65" path="/var/lib/kubelet/pods/cbb9e745-9259-437f-a600-80153e687c65/volumes" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.384393 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4dd9a2-87aa-4405-a4cc-778c778aaec9" path="/var/lib/kubelet/pods/fc4dd9a2-87aa-4405-a4cc-778c778aaec9/volumes" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.412979 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768gd\" (UniqueName: \"kubernetes.io/projected/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-kube-api-access-768gd\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.413066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-catalog-content\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.413145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-utilities\") pod 
\"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.413551 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-catalog-content\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.414251 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-utilities\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.435844 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768gd\" (UniqueName: \"kubernetes.io/projected/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8-kube-api-access-768gd\") pod \"redhat-marketplace-8w24r\" (UID: \"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\") " pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.515854 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.584220 4892 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.584494 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8" gracePeriod=15 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.584622 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5" gracePeriod=15 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.584658 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193" gracePeriod=15 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.584684 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c" gracePeriod=15 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.584712 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" containerID="cri-o://b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63" gracePeriod=15 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586382 4892 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586729 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586746 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586772 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586780 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586788 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586795 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586806 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586836 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586848 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586855 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586862 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586869 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586880 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586889 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: E0217 17:47:43.586926 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.586934 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587083 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587098 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587116 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587126 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587163 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587285 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.587579 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.588849 4892 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.589258 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.594351 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.719612 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.719961 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.720045 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.720125 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.720171 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.720194 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.720255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.720341 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.821659 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.821699 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.821779 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.821858 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.821864 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822005 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822105 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822151 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822177 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822209 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 
17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822267 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822269 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.822387 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.910162 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.911563 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.912224 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5" exitCode=0 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.912245 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193" exitCode=0 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.912254 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c" exitCode=0 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.912262 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63" exitCode=2 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.912341 4892 scope.go:117] "RemoveContainer" containerID="26462d2b91f60681af53ca1c113c56802b51da2f3f7012eb98c7c2517b61c9e6" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.915114 4892 generic.go:334] "Generic (PLEG): container finished" podID="ab22fa14-083f-49e3-bfac-a797f4393a1c" containerID="e42fd288b10656213d300284a6312c12c0d7d24d9471f1667b535a75f57a5ca6" exitCode=0 Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.915212 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"ab22fa14-083f-49e3-bfac-a797f4393a1c","Type":"ContainerDied","Data":"e42fd288b10656213d300284a6312c12c0d7d24d9471f1667b535a75f57a5ca6"} Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.915720 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.916299 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.923216 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.923661 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:43 crc kubenswrapper[4892]: I0217 17:47:43.924034 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:44 crc kubenswrapper[4892]: E0217 17:47:44.128069 4892 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 17:47:44 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059" Netns:"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:44 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:47:44 crc kubenswrapper[4892]: > Feb 17 17:47:44 crc kubenswrapper[4892]: E0217 17:47:44.128154 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for 
pod" err=< Feb 17 17:47:44 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059" Netns:"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:44 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:47:44 crc kubenswrapper[4892]: > 
pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:44 crc kubenswrapper[4892]: E0217 17:47:44.128181 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 17 17:47:44 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059" Netns:"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:44 crc kubenswrapper[4892]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:47:44 crc kubenswrapper[4892]: > pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:44 crc kubenswrapper[4892]: E0217 17:47:44.128248 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-8w24r_openshift-marketplace(b11f3a8b-12fd-40bd-b67b-fa0ba188eed8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-8w24r_openshift-marketplace(b11f3a8b-12fd-40bd-b67b-fa0ba188eed8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059\\\" Netns:\\\"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s\\\": dial tcp 38.102.83.41:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-8w24r" podUID="b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Feb 17 17:47:44 crc kubenswrapper[4892]: E0217 17:47:44.128797 4892 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.41:6443: connect: connection refused" event=< Feb 17 17:47:44 crc kubenswrapper[4892]: &Event{ObjectMeta:{redhat-marketplace-8w24r.189519dcb315b20a openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8w24r,UID:b11f3a8b-12fd-40bd-b67b-fa0ba188eed8,APIVersion:v1,ResourceVersion:30045,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod 
openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059" Netns:"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:44 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:47:44.128201226 +0000 UTC m=+235.503604491,LastTimestamp:2026-02-17 17:47:44.128201226 +0000 UTC m=+235.503604491,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 17 
17:47:44 crc kubenswrapper[4892]: > Feb 17 17:47:44 crc kubenswrapper[4892]: I0217 17:47:44.923483 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:47:44 crc kubenswrapper[4892]: I0217 17:47:44.924670 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:44 crc kubenswrapper[4892]: I0217 17:47:44.925341 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.242099 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.242680 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.243134 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.342388 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab22fa14-083f-49e3-bfac-a797f4393a1c-kube-api-access\") pod \"ab22fa14-083f-49e3-bfac-a797f4393a1c\" (UID: 
\"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.342782 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-var-lock\") pod \"ab22fa14-083f-49e3-bfac-a797f4393a1c\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.342893 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-kubelet-dir\") pod \"ab22fa14-083f-49e3-bfac-a797f4393a1c\" (UID: \"ab22fa14-083f-49e3-bfac-a797f4393a1c\") " Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.343022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-var-lock" (OuterVolumeSpecName: "var-lock") pod "ab22fa14-083f-49e3-bfac-a797f4393a1c" (UID: "ab22fa14-083f-49e3-bfac-a797f4393a1c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.343090 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab22fa14-083f-49e3-bfac-a797f4393a1c" (UID: "ab22fa14-083f-49e3-bfac-a797f4393a1c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.343171 4892 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.352495 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab22fa14-083f-49e3-bfac-a797f4393a1c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab22fa14-083f-49e3-bfac-a797f4393a1c" (UID: "ab22fa14-083f-49e3-bfac-a797f4393a1c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.443896 4892 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab22fa14-083f-49e3-bfac-a797f4393a1c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.443929 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab22fa14-083f-49e3-bfac-a797f4393a1c-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:45 crc kubenswrapper[4892]: E0217 17:47:45.604597 4892 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 17:47:45 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570" 
Netns:"/var/run/netns/c52b44fd-f064-4566-a2e1-036deb0e1922" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:45 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:47:45 crc kubenswrapper[4892]: > Feb 17 17:47:45 crc kubenswrapper[4892]: E0217 17:47:45.604660 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 17:47:45 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570" Netns:"/var/run/netns/c52b44fd-f064-4566-a2e1-036deb0e1922" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:45 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:47:45 crc kubenswrapper[4892]: > pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:45 crc kubenswrapper[4892]: E0217 17:47:45.604682 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 17 17:47:45 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570" Netns:"/var/run/netns/c52b44fd-f064-4566-a2e1-036deb0e1922" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused Feb 17 17:47:45 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:47:45 crc kubenswrapper[4892]: > pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:47:45 crc kubenswrapper[4892]: E0217 17:47:45.604743 4892 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-8w24r_openshift-marketplace(b11f3a8b-12fd-40bd-b67b-fa0ba188eed8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-8w24r_openshift-marketplace(b11f3a8b-12fd-40bd-b67b-fa0ba188eed8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570\\\" Netns:\\\"/var/run/netns/c52b44fd-f064-4566-a2e1-036deb0e1922\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=f9952f0dafac4a7ba2adf73425f961daeb2af14b7f4d6e4374d4f1ad66181570;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s\\\": dial tcp 38.102.83.41:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-8w24r" podUID="b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.938679 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.940957 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8" exitCode=0 Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.941288 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a9e0575314c83f262bf8f64c3b009f67ac1dc0292c89a71586c04b786f728ca" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.942994 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab22fa14-083f-49e3-bfac-a797f4393a1c","Type":"ContainerDied","Data":"517c82aea51facb30934905fe55335563cea82a61608b2c32f47896213638bcd"} Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.943025 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="517c82aea51facb30934905fe55335563cea82a61608b2c32f47896213638bcd" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.943154 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.970979 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.971213 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.974111 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.975806 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.976376 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.976626 4892 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:45 crc kubenswrapper[4892]: I0217 17:47:45.976958 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.048923 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.049197 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 
17:47:46.049378 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.049149 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.049315 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.049471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.050842 4892 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.051028 4892 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.051098 4892 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 17 17:47:46 crc kubenswrapper[4892]: E0217 17:47:46.589161 4892 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.41:6443: connect: connection refused" event=<
Feb 17 17:47:46 crc kubenswrapper[4892]: &Event{ObjectMeta:{redhat-marketplace-8w24r.189519dcb315b20a openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8w24r,UID:b11f3a8b-12fd-40bd-b67b-fa0ba188eed8,APIVersion:v1,ResourceVersion:30045,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059" Netns:"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused
Feb 17 17:47:46 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:47:44.128201226 +0000 UTC m=+235.503604491,LastTimestamp:2026-02-17 17:47:44.128201226 +0000 UTC m=+235.503604491,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 17 17:47:46 crc kubenswrapper[4892]: >
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.950285 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.966446 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.966918 4892 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:46 crc kubenswrapper[4892]: I0217 17:47:46.967293 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:47 crc kubenswrapper[4892]: I0217 17:47:47.366125 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 17 17:47:48 crc kubenswrapper[4892]: E0217 17:47:48.644492 4892 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.41:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 17:47:48 crc kubenswrapper[4892]: I0217 17:47:48.645640 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 17:47:48 crc kubenswrapper[4892]: I0217 17:47:48.964896 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c7b820c2dd32d5cad7dbb8b2f8c7b4779ecb7248f74e6d73a2ce66ee3e35f2e8"}
Feb 17 17:47:48 crc kubenswrapper[4892]: I0217 17:47:48.964958 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"20ee05244e55d970361e0db41973d3303336cba8e4c387af20c498e2513ccc7b"}
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.302701 4892 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.303399 4892 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.303836 4892 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.304171 4892 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.304488 4892 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: I0217 17:47:49.304529 4892 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.304882 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="200ms"
Feb 17 17:47:49 crc kubenswrapper[4892]: I0217 17:47:49.362616 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: I0217 17:47:49.363565 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.505755 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="400ms"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.907326 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="800ms"
Feb 17 17:47:49 crc kubenswrapper[4892]: I0217 17:47:49.969534 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:49 crc kubenswrapper[4892]: E0217 17:47:49.969573 4892 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.41:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 17:47:49 crc kubenswrapper[4892]: I0217 17:47:49.969858 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:50 crc kubenswrapper[4892]: E0217 17:47:50.708517 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="1.6s"
Feb 17 17:47:52 crc kubenswrapper[4892]: E0217 17:47:52.310192 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="3.2s"
Feb 17 17:47:55 crc kubenswrapper[4892]: E0217 17:47:55.511646 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.41:6443: connect: connection refused" interval="6.4s"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.358936 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.358980 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.360052 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.360442 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.360694 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.435801 4892 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.436317 4892 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:47:56 crc kubenswrapper[4892]: E0217 17:47:56.436931 4892 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:56 crc kubenswrapper[4892]: I0217 17:47:56.438057 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:56 crc kubenswrapper[4892]: E0217 17:47:56.590703 4892 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.41:6443: connect: connection refused" event=<
Feb 17 17:47:56 crc kubenswrapper[4892]: &Event{ObjectMeta:{redhat-marketplace-8w24r.189519dcb315b20a openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8w24r,UID:b11f3a8b-12fd-40bd-b67b-fa0ba188eed8,APIVersion:v1,ResourceVersion:30045,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059" Netns:"/var/run/netns/f74b59cb-f297-428e-95e5-7ebe51ce5932" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=98ae761801627b0880ca62f56f4df239df443dc3607e10994c6bf90a2bf43059;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused
Feb 17 17:47:56 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 17:47:44.128201226 +0000 UTC m=+235.503604491,LastTimestamp:2026-02-17 17:47:44.128201226 +0000 UTC m=+235.503604491,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 17 17:47:56 crc kubenswrapper[4892]: >
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.042710 4892 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7178fa54e20685df79a7f0b294161fe870597eee66b16afaed3289cca18879b2" exitCode=0
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.042765 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7178fa54e20685df79a7f0b294161fe870597eee66b16afaed3289cca18879b2"}
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.042794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"66707b10e8eb4182d35b5903d1eac9b1288a3b3743e7b354e05cab3275ba3c4e"}
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.043202 4892 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.043225 4892 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.043865 4892 status_manager.go:851] "Failed to get status for pod" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:57 crc kubenswrapper[4892]: E0217 17:47:57.043885 4892 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.41:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:57 crc kubenswrapper[4892]: I0217 17:47:57.044196 4892 status_manager.go:851] "Failed to get status for pod" podUID="4b8512dd-2720-49b7-a51c-be2074c3e435" pod="openshift-controller-manager/controller-manager-69b7b79fcc-zbfpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69b7b79fcc-zbfpd\": dial tcp 38.102.83.41:6443: connect: connection refused"
Feb 17 17:47:57 crc kubenswrapper[4892]: E0217 17:47:57.140266 4892 log.go:32] "RunPodSandbox from runtime service failed" err=<
Feb 17 17:47:57 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f" Netns:"/var/run/netns/3a47b6b4-9498-4964-9e14-014bd7bbd5af" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused
Feb 17 17:47:57 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 17 17:47:57 crc kubenswrapper[4892]: >
Feb 17 17:47:57 crc kubenswrapper[4892]: E0217 17:47:57.140346 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Feb 17 17:47:57 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f" Netns:"/var/run/netns/3a47b6b4-9498-4964-9e14-014bd7bbd5af" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused
Feb 17 17:47:57 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 17 17:47:57 crc kubenswrapper[4892]: > pod="openshift-marketplace/redhat-marketplace-8w24r"
Feb 17 17:47:57 crc kubenswrapper[4892]: E0217 17:47:57.140366 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Feb 17 17:47:57 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f" Netns:"/var/run/netns/3a47b6b4-9498-4964-9e14-014bd7bbd5af" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s": dial tcp 38.102.83.41:6443: connect: connection refused
Feb 17 17:47:57 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 17 17:47:57 crc kubenswrapper[4892]: > pod="openshift-marketplace/redhat-marketplace-8w24r"
Feb 17 17:47:57 crc kubenswrapper[4892]: E0217 17:47:57.140443 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-8w24r_openshift-marketplace(b11f3a8b-12fd-40bd-b67b-fa0ba188eed8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-8w24r_openshift-marketplace(b11f3a8b-12fd-40bd-b67b-fa0ba188eed8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8w24r_openshift-marketplace_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8_0(a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f): error adding pod openshift-marketplace_redhat-marketplace-8w24r to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f\\\" Netns:\\\"/var/run/netns/3a47b6b4-9498-4964-9e14-014bd7bbd5af\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-8w24r;K8S_POD_INFRA_CONTAINER_ID=a1342fbb7832198720b6784e16aa288ce92bdfdaa11cc28e19c4c5a212663e0f;K8S_POD_UID=b11f3a8b-12fd-40bd-b67b-fa0ba188eed8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-8w24r] networking: Multus: [openshift-marketplace/redhat-marketplace-8w24r/b11f3a8b-12fd-40bd-b67b-fa0ba188eed8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-8w24r in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8w24r?timeout=1m0s\\\": dial tcp 38.102.83.41:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-8w24r" podUID="b11f3a8b-12fd-40bd-b67b-fa0ba188eed8"
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.050388 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ed3ed263fb6901f935630df0e66c557bbbe5f543ff8bf91cffb4f0a3188b61a"}
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.050622 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17e7ed99d57d58ed603e632395407e5765d722924d71276986873c56ed202a21"}
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.050632 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"51de50720a707f7df18cf9dcf79715a4419c86000816e8755cf822a1f874ac02"}
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.050641 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5bb82fb5beebeb16d416fd3ebd6c8cf2a29e3bf8a597fb6c3afd882fcb6fd659"}
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.053865 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.053906 4892 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff" exitCode=1
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.053929 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff"}
Feb 17 17:47:58 crc kubenswrapper[4892]: I0217 17:47:58.054368 4892 scope.go:117] "RemoveContainer" containerID="cd3ead139c2a8e6feefbddce384ccd6eb8f26b4581d93053b80832fea7c8bdff"
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.062255 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.062726 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d054b704998fe9cba4b58963d10244c47c0a1492769fb77796947eade2cad25"}
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.067141 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a97a304ddb3eef7a806b267eb8267cc72a50c8f3aba7938f4034de65e6e1dfab"}
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.067411 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.067474 4892 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.067504 4892 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:47:59 crc kubenswrapper[4892]: I0217 17:47:59.810948 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:48:01 crc kubenswrapper[4892]: I0217 17:48:01.093708 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 17:48:01 crc kubenswrapper[4892]: I0217 17:48:01.094059 4892 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 17 17:48:01 crc kubenswrapper[4892]: I0217 17:48:01.094325 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 17 17:48:01 crc kubenswrapper[4892]: I0217 17:48:01.438351 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:48:01 crc kubenswrapper[4892]: I0217 17:48:01.438419 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:48:01 crc kubenswrapper[4892]: I0217 17:48:01.443953 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:48:04 crc kubenswrapper[4892]: I0217 17:48:04.083259 4892 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.101135 4892 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.101528 4892 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470"
Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.108540 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.112119 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1186b072-6ad0-47ae-953e-bab37c336077"
Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.287178 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" podUID="58da6748-a277-48e6-a169-6e6477486e44" containerName="oauth-openshift" containerID="cri-o://8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a" gracePeriod=15
Feb 17
17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.781703 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794674 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-login\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794731 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-audit-policies\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794776 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-cliconfig\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794807 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58da6748-a277-48e6-a169-6e6477486e44-audit-dir\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794883 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-service-ca\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" 
(UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794919 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-session\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794939 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58da6748-a277-48e6-a169-6e6477486e44-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.794977 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-provider-selection\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795013 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-idp-0-file-data\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795044 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-trusted-ca-bundle\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" 
(UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795079 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-error\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795140 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-serving-cert\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795188 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vbvj\" (UniqueName: \"kubernetes.io/projected/58da6748-a277-48e6-a169-6e6477486e44-kube-api-access-5vbvj\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795225 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-router-certs\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.795259 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-ocp-branding-template\") pod \"58da6748-a277-48e6-a169-6e6477486e44\" (UID: \"58da6748-a277-48e6-a169-6e6477486e44\") " Feb 17 17:48:05 crc 
kubenswrapper[4892]: I0217 17:48:05.795550 4892 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58da6748-a277-48e6-a169-6e6477486e44-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.796103 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.796364 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.796673 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.801112 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.803457 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58da6748-a277-48e6-a169-6e6477486e44-kube-api-access-5vbvj" (OuterVolumeSpecName: "kube-api-access-5vbvj") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "kube-api-access-5vbvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.805837 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.811788 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.812213 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.812864 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.827171 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.827418 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.827529 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.827741 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "58da6748-a277-48e6-a169-6e6477486e44" (UID: "58da6748-a277-48e6-a169-6e6477486e44"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896803 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896894 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vbvj\" (UniqueName: \"kubernetes.io/projected/58da6748-a277-48e6-a169-6e6477486e44-kube-api-access-5vbvj\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896914 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896933 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896949 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896965 4892 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896981 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.896997 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.897011 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.897029 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.897045 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.897060 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:05 crc kubenswrapper[4892]: I0217 17:48:05.897076 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58da6748-a277-48e6-a169-6e6477486e44-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 
17:48:06.108713 4892 generic.go:334] "Generic (PLEG): container finished" podID="58da6748-a277-48e6-a169-6e6477486e44" containerID="8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a" exitCode=0 Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.108893 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.108986 4892 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470" Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.109003 4892 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f6d4159c-33a5-42ef-9427-ad1fb2799470" Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.109172 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" event={"ID":"58da6748-a277-48e6-a169-6e6477486e44","Type":"ContainerDied","Data":"8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a"} Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.109202 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n8n76" event={"ID":"58da6748-a277-48e6-a169-6e6477486e44","Type":"ContainerDied","Data":"6b538bd5d8589bbca26cd1d57575037451e3ca26825fc1c6ce76fb8c708d61ac"} Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.109221 4892 scope.go:117] "RemoveContainer" containerID="8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a" Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.146477 4892 scope.go:117] "RemoveContainer" containerID="8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a" Feb 17 17:48:06 crc kubenswrapper[4892]: E0217 17:48:06.147252 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a\": container with ID starting with 8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a not found: ID does not exist" containerID="8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a" Feb 17 17:48:06 crc kubenswrapper[4892]: I0217 17:48:06.147283 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a"} err="failed to get container status \"8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a\": rpc error: code = NotFound desc = could not find container \"8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a\": container with ID starting with 8db01f8c4331823b1ba42f2b593d26b7e69fd64c411d09eed9a1461eab5f277a not found: ID does not exist" Feb 17 17:48:08 crc kubenswrapper[4892]: I0217 17:48:08.358769 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:48:08 crc kubenswrapper[4892]: I0217 17:48:08.359573 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:48:08 crc kubenswrapper[4892]: W0217 17:48:08.752105 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb11f3a8b_12fd_40bd_b67b_fa0ba188eed8.slice/crio-6f3285fe3fd4324063bdb079a2436b63b4bceb3e786a248423ec3fa98b234e82 WatchSource:0}: Error finding container 6f3285fe3fd4324063bdb079a2436b63b4bceb3e786a248423ec3fa98b234e82: Status 404 returned error can't find the container with id 6f3285fe3fd4324063bdb079a2436b63b4bceb3e786a248423ec3fa98b234e82 Feb 17 17:48:09 crc kubenswrapper[4892]: I0217 17:48:09.130087 4892 generic.go:334] "Generic (PLEG): container finished" podID="b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" containerID="b9c2a94c6cb34f8fb938d6981be1f5a1ef5896acbbab4ef4cd4c6757dba938b6" exitCode=0 Feb 17 17:48:09 crc kubenswrapper[4892]: I0217 17:48:09.130277 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w24r" event={"ID":"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8","Type":"ContainerDied","Data":"b9c2a94c6cb34f8fb938d6981be1f5a1ef5896acbbab4ef4cd4c6757dba938b6"} Feb 17 17:48:09 crc kubenswrapper[4892]: I0217 17:48:09.130370 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w24r" event={"ID":"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8","Type":"ContainerStarted","Data":"6f3285fe3fd4324063bdb079a2436b63b4bceb3e786a248423ec3fa98b234e82"} Feb 17 17:48:09 crc kubenswrapper[4892]: I0217 17:48:09.391187 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1186b072-6ad0-47ae-953e-bab37c336077" Feb 17 17:48:10 crc kubenswrapper[4892]: I0217 17:48:10.138261 4892 generic.go:334] "Generic (PLEG): container finished" podID="b11f3a8b-12fd-40bd-b67b-fa0ba188eed8" 
containerID="8f6bec20b04cda4bb1920aba58fb6b960b1027fe935fd3a8ae221b2ee1951c4f" exitCode=0 Feb 17 17:48:10 crc kubenswrapper[4892]: I0217 17:48:10.138309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w24r" event={"ID":"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8","Type":"ContainerDied","Data":"8f6bec20b04cda4bb1920aba58fb6b960b1027fe935fd3a8ae221b2ee1951c4f"} Feb 17 17:48:11 crc kubenswrapper[4892]: I0217 17:48:11.094098 4892 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 17:48:11 crc kubenswrapper[4892]: I0217 17:48:11.094780 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 17:48:11 crc kubenswrapper[4892]: I0217 17:48:11.145431 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w24r" event={"ID":"b11f3a8b-12fd-40bd-b67b-fa0ba188eed8","Type":"ContainerStarted","Data":"8270cd92687eba2f1d6d326e014771ceb8cdfdd9e25974ffff2f09e4408e4929"} Feb 17 17:48:13 crc kubenswrapper[4892]: I0217 17:48:13.220828 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 17:48:13 crc kubenswrapper[4892]: I0217 17:48:13.516536 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:48:13 crc kubenswrapper[4892]: I0217 17:48:13.516572 4892 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:48:13 crc kubenswrapper[4892]: I0217 17:48:13.579323 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:48:13 crc kubenswrapper[4892]: I0217 17:48:13.693104 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 17:48:13 crc kubenswrapper[4892]: I0217 17:48:13.811745 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 17:48:14 crc kubenswrapper[4892]: I0217 17:48:14.082290 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 17:48:14 crc kubenswrapper[4892]: I0217 17:48:14.112373 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 17:48:14 crc kubenswrapper[4892]: I0217 17:48:14.370607 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 17:48:14 crc kubenswrapper[4892]: I0217 17:48:14.813192 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 17:48:14 crc kubenswrapper[4892]: I0217 17:48:14.867653 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 17:48:15 crc kubenswrapper[4892]: I0217 17:48:15.070656 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 17:48:15 crc kubenswrapper[4892]: I0217 17:48:15.085477 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 17:48:15 crc kubenswrapper[4892]: 
I0217 17:48:15.482577 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 17:48:15 crc kubenswrapper[4892]: I0217 17:48:15.634371 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 17:48:15 crc kubenswrapper[4892]: I0217 17:48:15.979949 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 17:48:15 crc kubenswrapper[4892]: I0217 17:48:15.996629 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 17:48:16 crc kubenswrapper[4892]: I0217 17:48:16.018032 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 17:48:16 crc kubenswrapper[4892]: I0217 17:48:16.186598 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 17:48:16 crc kubenswrapper[4892]: I0217 17:48:16.676828 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 17:48:16 crc kubenswrapper[4892]: I0217 17:48:16.916932 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 17:48:16 crc kubenswrapper[4892]: I0217 17:48:16.966763 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.094685 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.139710 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 17:48:17 crc 
kubenswrapper[4892]: I0217 17:48:17.589049 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.591409 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.688784 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.691640 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.844896 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.876841 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 17:48:17 crc kubenswrapper[4892]: I0217 17:48:17.986254 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.089809 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.133821 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.203437 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.286766 4892 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.312300 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.357782 4892 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.358723 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8w24r" podStartSLOduration=33.927571037999996 podStartE2EDuration="35.35870333s" podCreationTimestamp="2026-02-17 17:47:43 +0000 UTC" firstStartedPulling="2026-02-17 17:48:09.132040745 +0000 UTC m=+260.507444010" lastFinishedPulling="2026-02-17 17:48:10.563173037 +0000 UTC m=+261.938576302" observedRunningTime="2026-02-17 17:48:11.159045007 +0000 UTC m=+262.534448272" watchObservedRunningTime="2026-02-17 17:48:18.35870333 +0000 UTC m=+269.734106605" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.363461 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-n8n76"] Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.363528 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.363553 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w24r"] Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.383966 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.383950105 podStartE2EDuration="14.383950105s" podCreationTimestamp="2026-02-17 17:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 17:48:18.38133309 +0000 UTC m=+269.756736355" watchObservedRunningTime="2026-02-17 17:48:18.383950105 +0000 UTC m=+269.759353370" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.406248 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8w24r" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.424031 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.460877 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.594356 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.770910 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.864470 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.902726 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.952017 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 17:48:18 crc kubenswrapper[4892]: I0217 17:48:18.984178 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.107009 4892 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.123793 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.186480 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.227491 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.263039 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.267393 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.277749 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.376096 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58da6748-a277-48e6-a169-6e6477486e44" path="/var/lib/kubelet/pods/58da6748-a277-48e6-a169-6e6477486e44/volumes" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.383991 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.541634 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.580949 4892 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.590480 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.634643 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.674547 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.826633 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 17:48:19 crc kubenswrapper[4892]: I0217 17:48:19.997860 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.180262 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.194255 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.309324 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.310904 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.382669 4892 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 
17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.443595 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.492206 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.495539 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.510996 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.548804 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.613110 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.630494 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.670101 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.723597 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.737236 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.822185 4892 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.858514 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.859034 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 17:48:20 crc kubenswrapper[4892]: I0217 17:48:20.910102 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.072097 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.098456 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.101576 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.105577 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.142752 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.230342 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.270857 4892 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.316382 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.330765 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.360024 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.487805 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.566254 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.604184 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.642127 4892 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.712701 4892 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.723189 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.736058 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-654c8698c5-bshb5"] Feb 17 17:48:21 crc kubenswrapper[4892]: E0217 17:48:21.736402 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" containerName="installer" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.736435 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" containerName="installer" Feb 17 17:48:21 crc kubenswrapper[4892]: E0217 17:48:21.736457 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58da6748-a277-48e6-a169-6e6477486e44" containerName="oauth-openshift" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.736469 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="58da6748-a277-48e6-a169-6e6477486e44" containerName="oauth-openshift" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.736619 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab22fa14-083f-49e3-bfac-a797f4393a1c" containerName="installer" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.736647 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="58da6748-a277-48e6-a169-6e6477486e44" containerName="oauth-openshift" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.737324 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.745448 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.745762 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.746074 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.746277 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.746499 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.746499 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.746776 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.747343 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.747774 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.750629 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 17:48:21 crc 
kubenswrapper[4892]: I0217 17:48:21.750659 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.754411 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.758732 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.761235 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.771545 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801594 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15fc4814-e17b-4a09-af98-b0317b9af1b2-audit-dir\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801643 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-login\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801681 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801704 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-service-ca\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801724 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801764 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-session\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801807 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-audit-policies\") pod \"oauth-openshift-654c8698c5-bshb5\" 
(UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801843 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801860 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-router-certs\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801877 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcz48\" (UniqueName: \"kubernetes.io/projected/15fc4814-e17b-4a09-af98-b0317b9af1b2-kube-api-access-xcz48\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801893 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: 
I0217 17:48:21.801911 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801929 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-error\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.801946 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.822975 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.864784 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902633 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-audit-policies\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902687 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902711 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcz48\" (UniqueName: \"kubernetes.io/projected/15fc4814-e17b-4a09-af98-b0317b9af1b2-kube-api-access-xcz48\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902730 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-router-certs\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902750 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " 
pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902777 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902797 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-error\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902870 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15fc4814-e17b-4a09-af98-b0317b9af1b2-audit-dir\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902931 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-login\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902964 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.902986 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-service-ca\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.903005 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.903043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-session\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " 
pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.903833 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.904017 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-audit-policies\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.904965 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.905108 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15fc4814-e17b-4a09-af98-b0317b9af1b2-audit-dir\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.905987 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-service-ca\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.910037 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-session\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.910209 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.910437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-login\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.914468 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" 
Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.914550 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.916205 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-template-error\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.917443 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.917806 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15fc4814-e17b-4a09-af98-b0317b9af1b2-v4-0-config-system-router-certs\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:21 crc kubenswrapper[4892]: I0217 17:48:21.928594 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcz48\" (UniqueName: 
\"kubernetes.io/projected/15fc4814-e17b-4a09-af98-b0317b9af1b2-kube-api-access-xcz48\") pod \"oauth-openshift-654c8698c5-bshb5\" (UID: \"15fc4814-e17b-4a09-af98-b0317b9af1b2\") " pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.001373 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.003839 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.022198 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.034436 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.063649 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.084047 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.151254 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.208658 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.245244 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.256907 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.265897 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.321636 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.348870 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.369672 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.401126 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 17:48:22 crc 
kubenswrapper[4892]: I0217 17:48:22.436764 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.457608 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.576742 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.603502 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.632746 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.675459 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.721720 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.756580 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.823845 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-654c8698c5-bshb5"] Feb 17 17:48:22 crc kubenswrapper[4892]: I0217 17:48:22.962056 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.070104 4892 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.077584 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.115474 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.116653 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.167062 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.250851 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.312501 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-654c8698c5-bshb5"] Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.487217 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.508427 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.598621 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.607723 4892 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.777213 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.893217 4892 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.895542 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.937353 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.996604 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 17:48:23 crc kubenswrapper[4892]: I0217 17:48:23.996710 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.194734 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.214806 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" event={"ID":"15fc4814-e17b-4a09-af98-b0317b9af1b2","Type":"ContainerStarted","Data":"127382e148d666da7a980e44a7f9ce8f2c371b5c74730cf3c3bb611d006c0dfb"} Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.214886 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" 
event={"ID":"15fc4814-e17b-4a09-af98-b0317b9af1b2","Type":"ContainerStarted","Data":"cdc4d91961a3c7cc569582321a95a05a659450bd82df9f8298fd9bad22f3d6f3"} Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.215211 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.221739 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.239588 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-654c8698c5-bshb5" podStartSLOduration=44.239570306 podStartE2EDuration="44.239570306s" podCreationTimestamp="2026-02-17 17:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:48:24.238846471 +0000 UTC m=+275.614249736" watchObservedRunningTime="2026-02-17 17:48:24.239570306 +0000 UTC m=+275.614973571" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.251225 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.259283 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.264728 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.278062 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.334249 4892 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.365444 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.419329 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.462704 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.468757 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.478786 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.490986 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.520103 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.520903 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.541045 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.574337 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 
17:48:24.581672 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.702021 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.709451 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.786431 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.818479 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.827793 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.864764 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.923611 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.940509 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 17:48:24 crc kubenswrapper[4892]: I0217 17:48:24.970181 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.005778 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 
17:48:25.187411 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.332030 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.386174 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.390252 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.424060 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.518208 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.522552 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.566153 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.704561 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.708797 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.798169 4892 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.806757 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 17:48:25 crc kubenswrapper[4892]: I0217 17:48:25.889473 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.031330 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.068320 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.190537 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.236933 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.328338 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.337862 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.430960 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.446128 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.448678 4892 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.491297 4892 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.534992 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.570360 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.573756 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.616896 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.617715 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.644704 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.820729 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.898723 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 17:48:26 crc kubenswrapper[4892]: I0217 17:48:26.910796 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 17:48:26 
crc kubenswrapper[4892]: I0217 17:48:26.966766 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.003928 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.008903 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.013462 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.061521 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.092346 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.191492 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.233053 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.315453 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.348603 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.529852 4892 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.740764 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.902462 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.930340 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 17:48:27 crc kubenswrapper[4892]: I0217 17:48:27.939709 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.066471 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.196638 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.232220 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.239039 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.311515 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.385481 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 
17:48:28.402172 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.656578 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.701781 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.756407 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.782398 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.818802 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.844941 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.930488 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 17:48:28 crc kubenswrapper[4892]: I0217 17:48:28.964240 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.264020 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.281890 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 17:48:29 crc 
kubenswrapper[4892]: I0217 17:48:29.333359 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.447904 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.884237 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.952843 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.952930 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 17:48:29 crc kubenswrapper[4892]: I0217 17:48:29.956427 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 17:48:30 crc kubenswrapper[4892]: I0217 17:48:30.003953 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 17:48:30 crc kubenswrapper[4892]: I0217 17:48:30.103058 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 17:48:30 crc kubenswrapper[4892]: I0217 17:48:30.149468 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 17:48:30 crc kubenswrapper[4892]: I0217 17:48:30.419994 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 17:48:31 crc kubenswrapper[4892]: I0217 17:48:31.165889 4892 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 17:48:31 crc kubenswrapper[4892]: I0217 17:48:31.247217 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 17:48:31 crc kubenswrapper[4892]: I0217 17:48:31.496453 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 17:48:37 crc kubenswrapper[4892]: I0217 17:48:37.833862 4892 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 17:48:37 crc kubenswrapper[4892]: I0217 17:48:37.836218 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c7b820c2dd32d5cad7dbb8b2f8c7b4779ecb7248f74e6d73a2ce66ee3e35f2e8" gracePeriod=5 Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.335063 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.335967 4892 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c7b820c2dd32d5cad7dbb8b2f8c7b4779ecb7248f74e6d73a2ce66ee3e35f2e8" exitCode=137 Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.438365 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.438440 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.505662 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.505800 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.506091 4892 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607160 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607253 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607377 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607430 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607462 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607603 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607967 4892 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.607991 4892 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.608004 4892 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.623261 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:43 crc kubenswrapper[4892]: I0217 17:48:43.708885 4892 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:44 crc kubenswrapper[4892]: I0217 17:48:44.344441 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 17:48:44 crc kubenswrapper[4892]: I0217 17:48:44.344579 4892 scope.go:117] "RemoveContainer" containerID="c7b820c2dd32d5cad7dbb8b2f8c7b4779ecb7248f74e6d73a2ce66ee3e35f2e8" Feb 17 17:48:44 crc kubenswrapper[4892]: I0217 17:48:44.344597 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 17:48:45 crc kubenswrapper[4892]: I0217 17:48:45.365595 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 17:48:49 crc kubenswrapper[4892]: I0217 17:48:49.125732 4892 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.240127 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xnd6s"] Feb 17 17:49:08 crc kubenswrapper[4892]: E0217 17:49:08.241055 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.241075 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 
17:49:08.241317 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.242626 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.244690 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.258194 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnd6s"] Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.379784 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9bb\" (UniqueName: \"kubernetes.io/projected/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-kube-api-access-pn9bb\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.379939 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-catalog-content\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.380005 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-utilities\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: 
I0217 17:49:08.444938 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cmmtx"] Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.447218 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.451485 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.469790 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmmtx"] Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.481175 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9bb\" (UniqueName: \"kubernetes.io/projected/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-kube-api-access-pn9bb\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.481283 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-catalog-content\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.481368 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-utilities\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.482055 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-utilities\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.482237 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-catalog-content\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.511490 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9bb\" (UniqueName: \"kubernetes.io/projected/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-kube-api-access-pn9bb\") pod \"certified-operators-xnd6s\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.582539 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c8f0-9011-4e0e-af19-c27399a3dfc0-utilities\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.582605 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c8f0-9011-4e0e-af19-c27399a3dfc0-catalog-content\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.582672 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfrz7\" (UniqueName: \"kubernetes.io/projected/f052c8f0-9011-4e0e-af19-c27399a3dfc0-kube-api-access-hfrz7\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.594437 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.684533 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c8f0-9011-4e0e-af19-c27399a3dfc0-utilities\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.684871 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c8f0-9011-4e0e-af19-c27399a3dfc0-catalog-content\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.684915 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfrz7\" (UniqueName: \"kubernetes.io/projected/f052c8f0-9011-4e0e-af19-c27399a3dfc0-kube-api-access-hfrz7\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.685437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c8f0-9011-4e0e-af19-c27399a3dfc0-utilities\") pod \"community-operators-cmmtx\" (UID: 
\"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.685685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c8f0-9011-4e0e-af19-c27399a3dfc0-catalog-content\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.719023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfrz7\" (UniqueName: \"kubernetes.io/projected/f052c8f0-9011-4e0e-af19-c27399a3dfc0-kube-api-access-hfrz7\") pod \"community-operators-cmmtx\" (UID: \"f052c8f0-9011-4e0e-af19-c27399a3dfc0\") " pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:08 crc kubenswrapper[4892]: I0217 17:49:08.771840 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.029368 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnd6s"] Feb 17 17:49:09 crc kubenswrapper[4892]: W0217 17:49:09.034137 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode446b496_04a9_4d11_b207_cdfe5ccc6d5a.slice/crio-ba10995aab86678b399e02de5daed1c0326197987867c6d1f5fe7d3b1253ba25 WatchSource:0}: Error finding container ba10995aab86678b399e02de5daed1c0326197987867c6d1f5fe7d3b1253ba25: Status 404 returned error can't find the container with id ba10995aab86678b399e02de5daed1c0326197987867c6d1f5fe7d3b1253ba25 Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.177065 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmmtx"] Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.492150 4892 generic.go:334] "Generic (PLEG): container finished" podID="f052c8f0-9011-4e0e-af19-c27399a3dfc0" containerID="277bd7d9c5b9ed3d08e02d4f6d5267b8cc4d6695670f5521a28f7e025b6b1972" exitCode=0 Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.492224 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmtx" event={"ID":"f052c8f0-9011-4e0e-af19-c27399a3dfc0","Type":"ContainerDied","Data":"277bd7d9c5b9ed3d08e02d4f6d5267b8cc4d6695670f5521a28f7e025b6b1972"} Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.492323 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmtx" event={"ID":"f052c8f0-9011-4e0e-af19-c27399a3dfc0","Type":"ContainerStarted","Data":"fc1c923a0b9394175e2a08cce7d1e8bb7c9c3dca0a3e9bc5b458abd7c342caff"} Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.494118 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerID="8916b2ec1365ba7be86238f8b9137908aef59a2f71931e55e3ee84a77219bf90" exitCode=0 Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.494206 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerDied","Data":"8916b2ec1365ba7be86238f8b9137908aef59a2f71931e55e3ee84a77219bf90"} Feb 17 17:49:09 crc kubenswrapper[4892]: I0217 17:49:09.494258 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerStarted","Data":"ba10995aab86678b399e02de5daed1c0326197987867c6d1f5fe7d3b1253ba25"} Feb 17 17:49:10 crc kubenswrapper[4892]: I0217 17:49:10.500871 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerStarted","Data":"c59b96f0f1058b9d1f168a2ac9505b8849b9b4b8e3630111342edb52404648b0"} Feb 17 17:49:10 crc kubenswrapper[4892]: I0217 17:49:10.503309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmtx" event={"ID":"f052c8f0-9011-4e0e-af19-c27399a3dfc0","Type":"ContainerStarted","Data":"6ad681ada17487982dd27c2a11bd9e67c22d0ebf0710d8c89d059b7ea714628e"} Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.233211 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ct6mr"] Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.234744 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.238296 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.246448 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ct6mr"] Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.324234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-utilities\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.324284 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-catalog-content\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.324380 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4bk\" (UniqueName: \"kubernetes.io/projected/50aef298-67b1-42cd-a4a5-86179693b0eb-kube-api-access-wz4bk\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.425858 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-utilities\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " 
pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.425949 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-catalog-content\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.426463 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-utilities\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.426546 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4bk\" (UniqueName: \"kubernetes.io/projected/50aef298-67b1-42cd-a4a5-86179693b0eb-kube-api-access-wz4bk\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.426570 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-catalog-content\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.462723 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4bk\" (UniqueName: \"kubernetes.io/projected/50aef298-67b1-42cd-a4a5-86179693b0eb-kube-api-access-wz4bk\") pod \"redhat-operators-ct6mr\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " pod="openshift-marketplace/redhat-operators-ct6mr" Feb 
17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.509797 4892 generic.go:334] "Generic (PLEG): container finished" podID="f052c8f0-9011-4e0e-af19-c27399a3dfc0" containerID="6ad681ada17487982dd27c2a11bd9e67c22d0ebf0710d8c89d059b7ea714628e" exitCode=0 Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.509875 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmtx" event={"ID":"f052c8f0-9011-4e0e-af19-c27399a3dfc0","Type":"ContainerDied","Data":"6ad681ada17487982dd27c2a11bd9e67c22d0ebf0710d8c89d059b7ea714628e"} Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.511392 4892 generic.go:334] "Generic (PLEG): container finished" podID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerID="c59b96f0f1058b9d1f168a2ac9505b8849b9b4b8e3630111342edb52404648b0" exitCode=0 Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.511414 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerDied","Data":"c59b96f0f1058b9d1f168a2ac9505b8849b9b4b8e3630111342edb52404648b0"} Feb 17 17:49:11 crc kubenswrapper[4892]: I0217 17:49:11.547849 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:12 crc kubenswrapper[4892]: I0217 17:49:12.518092 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmtx" event={"ID":"f052c8f0-9011-4e0e-af19-c27399a3dfc0","Type":"ContainerStarted","Data":"e4c0711eaab72afd5ed03f43df8bb29ead655d9ae40e74a0c090d0122cee2c7e"} Feb 17 17:49:12 crc kubenswrapper[4892]: I0217 17:49:12.521402 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerStarted","Data":"1c64ab752bc5e66cc7a7098a411531a3daf533cd829bad312c8be8492189604a"} Feb 17 17:49:12 crc kubenswrapper[4892]: I0217 17:49:12.542029 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cmmtx" podStartSLOduration=1.963402063 podStartE2EDuration="4.542010537s" podCreationTimestamp="2026-02-17 17:49:08 +0000 UTC" firstStartedPulling="2026-02-17 17:49:09.494000012 +0000 UTC m=+320.869403277" lastFinishedPulling="2026-02-17 17:49:12.072608476 +0000 UTC m=+323.448011751" observedRunningTime="2026-02-17 17:49:12.540936417 +0000 UTC m=+323.916339692" watchObservedRunningTime="2026-02-17 17:49:12.542010537 +0000 UTC m=+323.917413802" Feb 17 17:49:12 crc kubenswrapper[4892]: I0217 17:49:12.560147 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xnd6s" podStartSLOduration=2.116693531 podStartE2EDuration="4.560127493s" podCreationTimestamp="2026-02-17 17:49:08 +0000 UTC" firstStartedPulling="2026-02-17 17:49:09.496540171 +0000 UTC m=+320.871943446" lastFinishedPulling="2026-02-17 17:49:11.939974143 +0000 UTC m=+323.315377408" observedRunningTime="2026-02-17 17:49:12.558997023 +0000 UTC m=+323.934400298" watchObservedRunningTime="2026-02-17 17:49:12.560127493 +0000 UTC m=+323.935530758" Feb 17 
17:49:12 crc kubenswrapper[4892]: I0217 17:49:12.626268 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ct6mr"] Feb 17 17:49:12 crc kubenswrapper[4892]: W0217 17:49:12.633671 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50aef298_67b1_42cd_a4a5_86179693b0eb.slice/crio-8631449bdce6b6b6fff6cb93d87f0be6618eef0dff8c5210aeee4a9d5dd4afb4 WatchSource:0}: Error finding container 8631449bdce6b6b6fff6cb93d87f0be6618eef0dff8c5210aeee4a9d5dd4afb4: Status 404 returned error can't find the container with id 8631449bdce6b6b6fff6cb93d87f0be6618eef0dff8c5210aeee4a9d5dd4afb4 Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.231052 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wx7cq"] Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.232236 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.243774 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx7cq"] Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.357432 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-utilities\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.357533 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g5gg\" (UniqueName: \"kubernetes.io/projected/f9918aee-e88f-48dc-a1dd-f5de340b5570-kube-api-access-4g5gg\") pod \"certified-operators-wx7cq\" (UID: 
\"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.357585 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-catalog-content\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.458334 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-utilities\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.458409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g5gg\" (UniqueName: \"kubernetes.io/projected/f9918aee-e88f-48dc-a1dd-f5de340b5570-kube-api-access-4g5gg\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.458437 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-catalog-content\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.458945 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-utilities\") pod \"certified-operators-wx7cq\" (UID: 
\"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.459262 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-catalog-content\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.478231 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g5gg\" (UniqueName: \"kubernetes.io/projected/f9918aee-e88f-48dc-a1dd-f5de340b5570-kube-api-access-4g5gg\") pod \"certified-operators-wx7cq\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.530234 4892 generic.go:334] "Generic (PLEG): container finished" podID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerID="5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a" exitCode=0 Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.530656 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerDied","Data":"5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a"} Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.530708 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerStarted","Data":"8631449bdce6b6b6fff6cb93d87f0be6618eef0dff8c5210aeee4a9d5dd4afb4"} Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.588707 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.836749 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8d66x"] Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.841558 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.847076 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8d66x"] Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.875385 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx7cq"] Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.964491 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pw49\" (UniqueName: \"kubernetes.io/projected/db2e4438-9aec-4791-88de-9b1d82af406d-kube-api-access-7pw49\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.964538 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db2e4438-9aec-4791-88de-9b1d82af406d-catalog-content\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:13 crc kubenswrapper[4892]: I0217 17:49:13.964635 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db2e4438-9aec-4791-88de-9b1d82af406d-utilities\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " 
pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.065563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pw49\" (UniqueName: \"kubernetes.io/projected/db2e4438-9aec-4791-88de-9b1d82af406d-kube-api-access-7pw49\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.065609 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db2e4438-9aec-4791-88de-9b1d82af406d-catalog-content\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.065653 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db2e4438-9aec-4791-88de-9b1d82af406d-utilities\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.066315 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db2e4438-9aec-4791-88de-9b1d82af406d-catalog-content\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.066324 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db2e4438-9aec-4791-88de-9b1d82af406d-utilities\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " 
pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.085478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pw49\" (UniqueName: \"kubernetes.io/projected/db2e4438-9aec-4791-88de-9b1d82af406d-kube-api-access-7pw49\") pod \"community-operators-8d66x\" (UID: \"db2e4438-9aec-4791-88de-9b1d82af406d\") " pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.159048 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.390211 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8d66x"] Feb 17 17:49:14 crc kubenswrapper[4892]: W0217 17:49:14.398323 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2e4438_9aec_4791_88de_9b1d82af406d.slice/crio-3dfead2c93f5f6b8caa6102aa6cdfd384851b2a6668531c75afbd37b22b27cbf WatchSource:0}: Error finding container 3dfead2c93f5f6b8caa6102aa6cdfd384851b2a6668531c75afbd37b22b27cbf: Status 404 returned error can't find the container with id 3dfead2c93f5f6b8caa6102aa6cdfd384851b2a6668531c75afbd37b22b27cbf Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.537314 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerStarted","Data":"f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da"} Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.539012 4892 generic.go:334] "Generic (PLEG): container finished" podID="db2e4438-9aec-4791-88de-9b1d82af406d" containerID="d3bbabd6f20c8197ea839a7449f7d09432277c4ffd3389560c91269b95e136f2" exitCode=0 Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.539085 
4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d66x" event={"ID":"db2e4438-9aec-4791-88de-9b1d82af406d","Type":"ContainerDied","Data":"d3bbabd6f20c8197ea839a7449f7d09432277c4ffd3389560c91269b95e136f2"} Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.539146 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d66x" event={"ID":"db2e4438-9aec-4791-88de-9b1d82af406d","Type":"ContainerStarted","Data":"3dfead2c93f5f6b8caa6102aa6cdfd384851b2a6668531c75afbd37b22b27cbf"} Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.550512 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerID="b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37" exitCode=0 Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.550541 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx7cq" event={"ID":"f9918aee-e88f-48dc-a1dd-f5de340b5570","Type":"ContainerDied","Data":"b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37"} Feb 17 17:49:14 crc kubenswrapper[4892]: I0217 17:49:14.550561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx7cq" event={"ID":"f9918aee-e88f-48dc-a1dd-f5de340b5570","Type":"ContainerStarted","Data":"d163a85c08372f5fa77859458f92754a6b606ca3578b57e7ad7791325ee15e4d"} Feb 17 17:49:15 crc kubenswrapper[4892]: I0217 17:49:15.563657 4892 generic.go:334] "Generic (PLEG): container finished" podID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerID="f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da" exitCode=0 Feb 17 17:49:15 crc kubenswrapper[4892]: I0217 17:49:15.563775 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" 
event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerDied","Data":"f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da"} Feb 17 17:49:15 crc kubenswrapper[4892]: I0217 17:49:15.569350 4892 generic.go:334] "Generic (PLEG): container finished" podID="db2e4438-9aec-4791-88de-9b1d82af406d" containerID="fb9d7f9f768f089fcd8f2879a30473de410a196965afd5e46907f1f10a5dec59" exitCode=0 Feb 17 17:49:15 crc kubenswrapper[4892]: I0217 17:49:15.569453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d66x" event={"ID":"db2e4438-9aec-4791-88de-9b1d82af406d","Type":"ContainerDied","Data":"fb9d7f9f768f089fcd8f2879a30473de410a196965afd5e46907f1f10a5dec59"} Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.244856 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lx4bn"] Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.246376 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.256053 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lx4bn"] Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.294218 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-catalog-content\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.294273 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49sj\" (UniqueName: \"kubernetes.io/projected/9833d581-8f46-4e58-9a99-6ec65dc9431b-kube-api-access-z49sj\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.294338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-utilities\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.396120 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-catalog-content\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.396173 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z49sj\" (UniqueName: \"kubernetes.io/projected/9833d581-8f46-4e58-9a99-6ec65dc9431b-kube-api-access-z49sj\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.396234 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-utilities\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.396677 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-catalog-content\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.396736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-utilities\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.415455 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49sj\" (UniqueName: \"kubernetes.io/projected/9833d581-8f46-4e58-9a99-6ec65dc9431b-kube-api-access-z49sj\") pod \"redhat-operators-lx4bn\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.564582 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.578879 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d66x" event={"ID":"db2e4438-9aec-4791-88de-9b1d82af406d","Type":"ContainerStarted","Data":"9307344cae7ab3c6e1bd8a573d213297d644645c522e1e0e4dacbcdc6f90702d"} Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.581397 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerStarted","Data":"0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7"} Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.602079 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8d66x" podStartSLOduration=2.205340923 podStartE2EDuration="3.602052575s" podCreationTimestamp="2026-02-17 17:49:13 +0000 UTC" firstStartedPulling="2026-02-17 17:49:14.550583287 +0000 UTC m=+325.925986562" lastFinishedPulling="2026-02-17 17:49:15.947294929 +0000 UTC m=+327.322698214" observedRunningTime="2026-02-17 17:49:16.600043361 +0000 UTC m=+327.975446636" watchObservedRunningTime="2026-02-17 17:49:16.602052575 +0000 UTC m=+327.977455850" Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.619327 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ct6mr" podStartSLOduration=3.053919256 podStartE2EDuration="5.619306369s" podCreationTimestamp="2026-02-17 17:49:11 +0000 UTC" firstStartedPulling="2026-02-17 17:49:13.534928141 +0000 UTC m=+324.910331406" lastFinishedPulling="2026-02-17 17:49:16.100315214 +0000 UTC m=+327.475718519" observedRunningTime="2026-02-17 17:49:16.617711467 +0000 UTC m=+327.993114752" watchObservedRunningTime="2026-02-17 17:49:16.619306369 +0000 UTC m=+327.994709654" Feb 17 17:49:16 crc 
kubenswrapper[4892]: W0217 17:49:16.976680 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9833d581_8f46_4e58_9a99_6ec65dc9431b.slice/crio-3681f4b0a6d8090419fd5ee1ad34a1d105dd1efdc4462f1ce4ec364c50b8b3cf WatchSource:0}: Error finding container 3681f4b0a6d8090419fd5ee1ad34a1d105dd1efdc4462f1ce4ec364c50b8b3cf: Status 404 returned error can't find the container with id 3681f4b0a6d8090419fd5ee1ad34a1d105dd1efdc4462f1ce4ec364c50b8b3cf Feb 17 17:49:16 crc kubenswrapper[4892]: I0217 17:49:16.987123 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lx4bn"] Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.588549 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerID="5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286" exitCode=0 Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.588635 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx7cq" event={"ID":"f9918aee-e88f-48dc-a1dd-f5de340b5570","Type":"ContainerDied","Data":"5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286"} Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.590657 4892 generic.go:334] "Generic (PLEG): container finished" podID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerID="666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec" exitCode=0 Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.590758 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerDied","Data":"666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec"} Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.590789 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerStarted","Data":"3681f4b0a6d8090419fd5ee1ad34a1d105dd1efdc4462f1ce4ec364c50b8b3cf"} Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.643595 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hjlcl"] Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.646787 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.654747 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hjlcl"] Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.710544 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s65h8\" (UniqueName: \"kubernetes.io/projected/e6edd657-e697-42e2-bfc4-3ea98348eb23-kube-api-access-s65h8\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.710614 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-catalog-content\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.710834 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-utilities\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 
17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.812354 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-utilities\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.812435 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s65h8\" (UniqueName: \"kubernetes.io/projected/e6edd657-e697-42e2-bfc4-3ea98348eb23-kube-api-access-s65h8\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.812470 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-catalog-content\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.812969 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-catalog-content\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.813200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-utilities\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 
17:49:17.837362 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s65h8\" (UniqueName: \"kubernetes.io/projected/e6edd657-e697-42e2-bfc4-3ea98348eb23-kube-api-access-s65h8\") pod \"certified-operators-hjlcl\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:17 crc kubenswrapper[4892]: I0217 17:49:17.962778 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.400316 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hjlcl"] Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.595255 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.596088 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.599862 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerStarted","Data":"76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376"} Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.599897 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerStarted","Data":"4150b5ddbb9802419e9cfbdfa8318ab322335734077bc15885a044299956782f"} Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.606022 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx7cq" 
event={"ID":"f9918aee-e88f-48dc-a1dd-f5de340b5570","Type":"ContainerStarted","Data":"96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778"} Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.609444 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerStarted","Data":"bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f"} Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.639968 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pxrd"] Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.640971 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.650595 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pxrd"] Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.655244 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wx7cq" podStartSLOduration=2.17698484 podStartE2EDuration="5.655226023s" podCreationTimestamp="2026-02-17 17:49:13 +0000 UTC" firstStartedPulling="2026-02-17 17:49:14.55144771 +0000 UTC m=+325.926851015" lastFinishedPulling="2026-02-17 17:49:18.029688933 +0000 UTC m=+329.405092198" observedRunningTime="2026-02-17 17:49:18.654798052 +0000 UTC m=+330.030201327" watchObservedRunningTime="2026-02-17 17:49:18.655226023 +0000 UTC m=+330.030629288" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.672185 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.722794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k92vj\" (UniqueName: \"kubernetes.io/projected/c5624079-5b90-4427-a09c-0d96ed3ddebd-kube-api-access-k92vj\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.722925 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5624079-5b90-4427-a09c-0d96ed3ddebd-catalog-content\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.722978 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5624079-5b90-4427-a09c-0d96ed3ddebd-utilities\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.772091 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.772159 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.824294 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5624079-5b90-4427-a09c-0d96ed3ddebd-utilities\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.824795 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k92vj\" (UniqueName: \"kubernetes.io/projected/c5624079-5b90-4427-a09c-0d96ed3ddebd-kube-api-access-k92vj\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.824896 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5624079-5b90-4427-a09c-0d96ed3ddebd-catalog-content\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.825255 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5624079-5b90-4427-a09c-0d96ed3ddebd-utilities\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.825393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5624079-5b90-4427-a09c-0d96ed3ddebd-catalog-content\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.830451 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.845240 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k92vj\" (UniqueName: \"kubernetes.io/projected/c5624079-5b90-4427-a09c-0d96ed3ddebd-kube-api-access-k92vj\") pod \"community-operators-4pxrd\" (UID: \"c5624079-5b90-4427-a09c-0d96ed3ddebd\") " 
pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:18 crc kubenswrapper[4892]: I0217 17:49:18.973956 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.456301 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pxrd"] Feb 17 17:49:19 crc kubenswrapper[4892]: W0217 17:49:19.465230 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5624079_5b90_4427_a09c_0d96ed3ddebd.slice/crio-31997fbc4d9726853dfa8d2954452e82db0af16356f8372502621ff4a2c9ce92 WatchSource:0}: Error finding container 31997fbc4d9726853dfa8d2954452e82db0af16356f8372502621ff4a2c9ce92: Status 404 returned error can't find the container with id 31997fbc4d9726853dfa8d2954452e82db0af16356f8372502621ff4a2c9ce92 Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.621566 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pxrd" event={"ID":"c5624079-5b90-4427-a09c-0d96ed3ddebd","Type":"ContainerStarted","Data":"31997fbc4d9726853dfa8d2954452e82db0af16356f8372502621ff4a2c9ce92"} Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.623670 4892 generic.go:334] "Generic (PLEG): container finished" podID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerID="76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376" exitCode=0 Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.623746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerDied","Data":"76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376"} Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.626151 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerID="bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f" exitCode=0 Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.626455 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerDied","Data":"bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f"} Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.680139 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cmmtx" Feb 17 17:49:19 crc kubenswrapper[4892]: I0217 17:49:19.700309 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.034397 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x8f4q"] Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.035645 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.046025 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x8f4q"] Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.141880 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-utilities\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.141963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cr9\" (UniqueName: \"kubernetes.io/projected/94374c20-25dd-491e-8778-0b00d10afda0-kube-api-access-m4cr9\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.142009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-catalog-content\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.243133 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-utilities\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.243235 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m4cr9\" (UniqueName: \"kubernetes.io/projected/94374c20-25dd-491e-8778-0b00d10afda0-kube-api-access-m4cr9\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.243281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-catalog-content\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.243891 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-utilities\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.244136 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-catalog-content\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.266141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cr9\" (UniqueName: \"kubernetes.io/projected/94374c20-25dd-491e-8778-0b00d10afda0-kube-api-access-m4cr9\") pod \"redhat-operators-x8f4q\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.349147 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.633356 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerStarted","Data":"076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b"} Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.634638 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pxrd" event={"ID":"c5624079-5b90-4427-a09c-0d96ed3ddebd","Type":"ContainerStarted","Data":"81b6fc2b2024854d4d71058097b71352ef2c94a94cd6614e1745bee067e8e13a"} Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.637768 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerStarted","Data":"b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df"} Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.653051 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lx4bn" podStartSLOduration=2.21846829 podStartE2EDuration="4.653035293s" podCreationTimestamp="2026-02-17 17:49:16 +0000 UTC" firstStartedPulling="2026-02-17 17:49:17.591920533 +0000 UTC m=+328.967323798" lastFinishedPulling="2026-02-17 17:49:20.026487536 +0000 UTC m=+331.401890801" observedRunningTime="2026-02-17 17:49:20.649471177 +0000 UTC m=+332.024874462" watchObservedRunningTime="2026-02-17 17:49:20.653035293 +0000 UTC m=+332.028438558" Feb 17 17:49:20 crc kubenswrapper[4892]: I0217 17:49:20.735142 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x8f4q"] Feb 17 17:49:20 crc kubenswrapper[4892]: W0217 17:49:20.739912 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94374c20_25dd_491e_8778_0b00d10afda0.slice/crio-5fed5f18e8e09f644c7feb02c6eba4a9a0e607e3b4689693b4baf8c49522ba35 WatchSource:0}: Error finding container 5fed5f18e8e09f644c7feb02c6eba4a9a0e607e3b4689693b4baf8c49522ba35: Status 404 returned error can't find the container with id 5fed5f18e8e09f644c7feb02c6eba4a9a0e607e3b4689693b4baf8c49522ba35 Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.040918 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jnfwl"] Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.042089 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.052259 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnfwl"] Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.152633 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-catalog-content\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.152673 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqw7\" (UniqueName: \"kubernetes.io/projected/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-kube-api-access-8vqw7\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.152767 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-utilities\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.253966 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-catalog-content\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.254052 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqw7\" (UniqueName: \"kubernetes.io/projected/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-kube-api-access-8vqw7\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.254213 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-utilities\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.254613 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-utilities\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.254694 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-catalog-content\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.277852 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqw7\" (UniqueName: \"kubernetes.io/projected/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-kube-api-access-8vqw7\") pod \"certified-operators-jnfwl\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.358331 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.548706 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.548965 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.643884 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerStarted","Data":"f10b38ed42395540b600837982e306e0f8547758920c7092c67abace4e1b4581"} Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.643925 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerStarted","Data":"5fed5f18e8e09f644c7feb02c6eba4a9a0e607e3b4689693b4baf8c49522ba35"} Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.646133 4892 generic.go:334] "Generic (PLEG): container finished" podID="c5624079-5b90-4427-a09c-0d96ed3ddebd" 
containerID="81b6fc2b2024854d4d71058097b71352ef2c94a94cd6614e1745bee067e8e13a" exitCode=0 Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.647062 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pxrd" event={"ID":"c5624079-5b90-4427-a09c-0d96ed3ddebd","Type":"ContainerDied","Data":"81b6fc2b2024854d4d71058097b71352ef2c94a94cd6614e1745bee067e8e13a"} Feb 17 17:49:21 crc kubenswrapper[4892]: I0217 17:49:21.764014 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnfwl"] Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.435620 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xnsqq"] Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.437265 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.447777 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnsqq"] Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.467965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e76551-8839-410b-9aec-a671ca2119dc-catalog-content\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.468272 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e76551-8839-410b-9aec-a671ca2119dc-utilities\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 
17:49:22.468447 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9m4\" (UniqueName: \"kubernetes.io/projected/b8e76551-8839-410b-9aec-a671ca2119dc-kube-api-access-xs9m4\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.569383 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e76551-8839-410b-9aec-a671ca2119dc-utilities\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.569470 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9m4\" (UniqueName: \"kubernetes.io/projected/b8e76551-8839-410b-9aec-a671ca2119dc-kube-api-access-xs9m4\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.569517 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e76551-8839-410b-9aec-a671ca2119dc-catalog-content\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.569936 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e76551-8839-410b-9aec-a671ca2119dc-utilities\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 
17:49:22.570113 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e76551-8839-410b-9aec-a671ca2119dc-catalog-content\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.591562 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9m4\" (UniqueName: \"kubernetes.io/projected/b8e76551-8839-410b-9aec-a671ca2119dc-kube-api-access-xs9m4\") pod \"community-operators-xnsqq\" (UID: \"b8e76551-8839-410b-9aec-a671ca2119dc\") " pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.607978 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ct6mr" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="registry-server" probeResult="failure" output=< Feb 17 17:49:22 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:49:22 crc kubenswrapper[4892]: > Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.652639 4892 generic.go:334] "Generic (PLEG): container finished" podID="94374c20-25dd-491e-8778-0b00d10afda0" containerID="f10b38ed42395540b600837982e306e0f8547758920c7092c67abace4e1b4581" exitCode=0 Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.652716 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerDied","Data":"f10b38ed42395540b600837982e306e0f8547758920c7092c67abace4e1b4581"} Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.655857 4892 generic.go:334] "Generic (PLEG): container finished" podID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerID="5ff92214d6f6318966ebda783a4c31c55dbc09aa86f2bafab13284df0b0b7df6" exitCode=0 Feb 17 
17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.655962 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerDied","Data":"5ff92214d6f6318966ebda783a4c31c55dbc09aa86f2bafab13284df0b0b7df6"} Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.655990 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerStarted","Data":"a45d5fbdd1d074685bdd6d039464c9dcfbb02b3895b1b830ae2673c1dec44789"} Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.668595 4892 generic.go:334] "Generic (PLEG): container finished" podID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerID="b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df" exitCode=0 Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.668641 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerDied","Data":"b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df"} Feb 17 17:49:22 crc kubenswrapper[4892]: I0217 17:49:22.806967 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.255135 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnsqq"] Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.447890 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k2k8h"] Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.449931 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.455832 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2k8h"] Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.483360 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7rt\" (UniqueName: \"kubernetes.io/projected/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-kube-api-access-lc7rt\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.483465 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-catalog-content\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.483503 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-utilities\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.584567 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc7rt\" (UniqueName: \"kubernetes.io/projected/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-kube-api-access-lc7rt\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.584672 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-catalog-content\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.584721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-utilities\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.585471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-utilities\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.585696 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-catalog-content\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.589471 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.589553 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.616638 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc7rt\" (UniqueName: 
\"kubernetes.io/projected/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-kube-api-access-lc7rt\") pod \"redhat-operators-k2k8h\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.635607 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.682174 4892 generic.go:334] "Generic (PLEG): container finished" podID="c5624079-5b90-4427-a09c-0d96ed3ddebd" containerID="91a1df79e609a6127420838320066f432b02cd9ba6f74870c490778e507fb0a7" exitCode=0 Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.682243 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pxrd" event={"ID":"c5624079-5b90-4427-a09c-0d96ed3ddebd","Type":"ContainerDied","Data":"91a1df79e609a6127420838320066f432b02cd9ba6f74870c490778e507fb0a7"} Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.683317 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnsqq" event={"ID":"b8e76551-8839-410b-9aec-a671ca2119dc","Type":"ContainerStarted","Data":"0f9020b7cceeb59149db7b23c12f542496b32e588729c1b8e1e1626ced957cb0"} Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.720861 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 17:49:23 crc kubenswrapper[4892]: I0217 17:49:23.775261 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.160246 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.161508 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:24 crc kubenswrapper[4892]: W0217 17:49:24.170729 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode730a12c_43a9_45a7_a52b_9f6fd3eaecaf.slice/crio-1626b29e905340f6deacd2ba4712da7f61425d837a482ae74fd71298e9f9b9e5 WatchSource:0}: Error finding container 1626b29e905340f6deacd2ba4712da7f61425d837a482ae74fd71298e9f9b9e5: Status 404 returned error can't find the container with id 1626b29e905340f6deacd2ba4712da7f61425d837a482ae74fd71298e9f9b9e5 Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.171042 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2k8h"] Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.227221 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.703897 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerStarted","Data":"ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4"} Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.705782 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerStarted","Data":"1ac8ae4b6ebbee6f0f846c3fe0d25eba908841a626fe85a5379294620d9d5406"} 
Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.705838 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerStarted","Data":"1626b29e905340f6deacd2ba4712da7f61425d837a482ae74fd71298e9f9b9e5"} Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.708695 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerStarted","Data":"6254cae7da380d7ce1441e1e4c8df8573f650443ff6cb9e79bd49fa402b06d1d"} Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.711222 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnsqq" event={"ID":"b8e76551-8839-410b-9aec-a671ca2119dc","Type":"ContainerStarted","Data":"da976570c9bc028312a88a1fa306bd6610fa8a5490a2fbc6997ba40a9b0c8843"} Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.722408 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hjlcl" podStartSLOduration=2.915113442 podStartE2EDuration="7.722369704s" podCreationTimestamp="2026-02-17 17:49:17 +0000 UTC" firstStartedPulling="2026-02-17 17:49:19.625518934 +0000 UTC m=+331.000922199" lastFinishedPulling="2026-02-17 17:49:24.432775186 +0000 UTC m=+335.808178461" observedRunningTime="2026-02-17 17:49:24.721446489 +0000 UTC m=+336.096849764" watchObservedRunningTime="2026-02-17 17:49:24.722369704 +0000 UTC m=+336.097772989" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.767410 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8d66x" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.844872 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ljjvr"] Feb 17 17:49:24 crc 
kubenswrapper[4892]: I0217 17:49:24.846173 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.857377 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljjvr"] Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.902376 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-utilities\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.902770 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-catalog-content\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:24 crc kubenswrapper[4892]: I0217 17:49:24.902878 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf2rt\" (UniqueName: \"kubernetes.io/projected/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-kube-api-access-gf2rt\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.003754 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-utilities\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc 
kubenswrapper[4892]: I0217 17:49:25.003849 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-catalog-content\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.003876 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf2rt\" (UniqueName: \"kubernetes.io/projected/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-kube-api-access-gf2rt\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.004520 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-catalog-content\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.004623 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-utilities\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.025796 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf2rt\" (UniqueName: \"kubernetes.io/projected/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-kube-api-access-gf2rt\") pod \"certified-operators-ljjvr\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 
17:49:25.167872 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.644190 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljjvr"] Feb 17 17:49:25 crc kubenswrapper[4892]: W0217 17:49:25.652463 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b23b6a9_794e_40ce_93b0_8b60ec25cf20.slice/crio-d08591d644054f20e71ce9acb2784ca82e53a24408d7527da05e59e6615dd2d3 WatchSource:0}: Error finding container d08591d644054f20e71ce9acb2784ca82e53a24408d7527da05e59e6615dd2d3: Status 404 returned error can't find the container with id d08591d644054f20e71ce9acb2784ca82e53a24408d7527da05e59e6615dd2d3 Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.717292 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerStarted","Data":"d1014d4ed3072d95e63d7b7bae47fe8dcd0e17bc688218b747f04f332329d99a"} Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.718195 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerStarted","Data":"d08591d644054f20e71ce9acb2784ca82e53a24408d7527da05e59e6615dd2d3"} Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.720145 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pxrd" event={"ID":"c5624079-5b90-4427-a09c-0d96ed3ddebd","Type":"ContainerStarted","Data":"38772aa7ed1432e29af1464ff5a1c6f78c3e569ae52aef5a80715e899bf5b5e5"} Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.752150 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-4pxrd" podStartSLOduration=4.563465252 podStartE2EDuration="7.752133723s" podCreationTimestamp="2026-02-17 17:49:18 +0000 UTC" firstStartedPulling="2026-02-17 17:49:21.647910164 +0000 UTC m=+333.023313429" lastFinishedPulling="2026-02-17 17:49:24.836578635 +0000 UTC m=+336.211981900" observedRunningTime="2026-02-17 17:49:25.750560931 +0000 UTC m=+337.125964196" watchObservedRunningTime="2026-02-17 17:49:25.752133723 +0000 UTC m=+337.127536988" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.839032 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hghz"] Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.840586 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.853528 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hghz"] Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.911749 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnndd\" (UniqueName: \"kubernetes.io/projected/3b7d2423-4734-4106-91f3-fd4f6712a9d0-kube-api-access-lnndd\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.911844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d2423-4734-4106-91f3-fd4f6712a9d0-catalog-content\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:25 crc kubenswrapper[4892]: I0217 17:49:25.911868 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d2423-4734-4106-91f3-fd4f6712a9d0-utilities\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.013521 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnndd\" (UniqueName: \"kubernetes.io/projected/3b7d2423-4734-4106-91f3-fd4f6712a9d0-kube-api-access-lnndd\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.013726 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d2423-4734-4106-91f3-fd4f6712a9d0-catalog-content\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.013757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d2423-4734-4106-91f3-fd4f6712a9d0-utilities\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.014151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d2423-4734-4106-91f3-fd4f6712a9d0-catalog-content\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.014291 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d2423-4734-4106-91f3-fd4f6712a9d0-utilities\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.037555 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnndd\" (UniqueName: \"kubernetes.io/projected/3b7d2423-4734-4106-91f3-fd4f6712a9d0-kube-api-access-lnndd\") pod \"community-operators-7hghz\" (UID: \"3b7d2423-4734-4106-91f3-fd4f6712a9d0\") " pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.216045 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.565935 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.566290 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.647462 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hghz"] Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.727475 4892 generic.go:334] "Generic (PLEG): container finished" podID="b8e76551-8839-410b-9aec-a671ca2119dc" containerID="da976570c9bc028312a88a1fa306bd6610fa8a5490a2fbc6997ba40a9b0c8843" exitCode=0 Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.727524 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnsqq" event={"ID":"b8e76551-8839-410b-9aec-a671ca2119dc","Type":"ContainerDied","Data":"da976570c9bc028312a88a1fa306bd6610fa8a5490a2fbc6997ba40a9b0c8843"} 
Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.730079 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hghz" event={"ID":"3b7d2423-4734-4106-91f3-fd4f6712a9d0","Type":"ContainerStarted","Data":"d79244bfea19b57f508956e7b42169796399e4e84b4b8d6619576abc7b5c60a3"} Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.734496 4892 generic.go:334] "Generic (PLEG): container finished" podID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerID="1ac8ae4b6ebbee6f0f846c3fe0d25eba908841a626fe85a5379294620d9d5406" exitCode=0 Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.734569 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerDied","Data":"1ac8ae4b6ebbee6f0f846c3fe0d25eba908841a626fe85a5379294620d9d5406"} Feb 17 17:49:26 crc kubenswrapper[4892]: I0217 17:49:26.737881 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerStarted","Data":"c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44"} Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.245850 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gfc4l"] Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.247533 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.257993 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gfc4l"] Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.434045 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-catalog-content\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.434131 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-454vb\" (UniqueName: \"kubernetes.io/projected/31427c94-1de1-431c-9f4b-e87f01548d3f-kube-api-access-454vb\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.434189 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-utilities\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.535213 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-catalog-content\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.535431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-454vb\" (UniqueName: \"kubernetes.io/projected/31427c94-1de1-431c-9f4b-e87f01548d3f-kube-api-access-454vb\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.535527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-utilities\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.536371 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-catalog-content\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.536439 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-utilities\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.571759 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-454vb\" (UniqueName: \"kubernetes.io/projected/31427c94-1de1-431c-9f4b-e87f01548d3f-kube-api-access-454vb\") pod \"redhat-operators-gfc4l\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.602980 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lx4bn" 
podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="registry-server" probeResult="failure" output=< Feb 17 17:49:27 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:49:27 crc kubenswrapper[4892]: > Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.743721 4892 generic.go:334] "Generic (PLEG): container finished" podID="94374c20-25dd-491e-8778-0b00d10afda0" containerID="d1014d4ed3072d95e63d7b7bae47fe8dcd0e17bc688218b747f04f332329d99a" exitCode=0 Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.743764 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerDied","Data":"d1014d4ed3072d95e63d7b7bae47fe8dcd0e17bc688218b747f04f332329d99a"} Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.745727 4892 generic.go:334] "Generic (PLEG): container finished" podID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerID="c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44" exitCode=0 Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.745788 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerDied","Data":"c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44"} Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.748245 4892 generic.go:334] "Generic (PLEG): container finished" podID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerID="6254cae7da380d7ce1441e1e4c8df8573f650443ff6cb9e79bd49fa402b06d1d" exitCode=0 Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.748317 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerDied","Data":"6254cae7da380d7ce1441e1e4c8df8573f650443ff6cb9e79bd49fa402b06d1d"} Feb 17 17:49:27 crc 
kubenswrapper[4892]: I0217 17:49:27.753494 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hghz" event={"ID":"3b7d2423-4734-4106-91f3-fd4f6712a9d0","Type":"ContainerStarted","Data":"a48339fdeae0d0b590c5b0eeb2980b72b9022267ddc7fa7cf3aa82db9c2a816d"} Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.869411 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.963248 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:27 crc kubenswrapper[4892]: I0217 17:49:27.963504 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.068625 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.112647 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gfc4l"] Feb 17 17:49:28 crc kubenswrapper[4892]: W0217 17:49:28.122004 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31427c94_1de1_431c_9f4b_e87f01548d3f.slice/crio-15daf13eff4370c279d6830d07c6b949cfd623dac781dc351e393c57a91b833c WatchSource:0}: Error finding container 15daf13eff4370c279d6830d07c6b949cfd623dac781dc351e393c57a91b833c: Status 404 returned error can't find the container with id 15daf13eff4370c279d6830d07c6b949cfd623dac781dc351e393c57a91b833c Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.236748 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hw9fh"] Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 
17:49:28.237890 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.245071 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7p5g\" (UniqueName: \"kubernetes.io/projected/8d6875b4-db50-44e7-b270-ceb7bc6ef804-kube-api-access-s7p5g\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.245137 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-utilities\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.245196 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-catalog-content\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.253134 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hw9fh"] Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.347099 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7p5g\" (UniqueName: \"kubernetes.io/projected/8d6875b4-db50-44e7-b270-ceb7bc6ef804-kube-api-access-s7p5g\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc 
kubenswrapper[4892]: I0217 17:49:28.347510 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-utilities\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.347584 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-catalog-content\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.348249 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-catalog-content\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.348263 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-utilities\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.371998 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7p5g\" (UniqueName: \"kubernetes.io/projected/8d6875b4-db50-44e7-b270-ceb7bc6ef804-kube-api-access-s7p5g\") pod \"certified-operators-hw9fh\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.552536 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.764528 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerStarted","Data":"15daf13eff4370c279d6830d07c6b949cfd623dac781dc351e393c57a91b833c"} Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.974570 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:28 crc kubenswrapper[4892]: I0217 17:49:28.974638 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.005170 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hw9fh"] Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.016791 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:29 crc kubenswrapper[4892]: W0217 17:49:29.058239 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6875b4_db50_44e7_b270_ceb7bc6ef804.slice/crio-e8bc16e1a1a2023ac06d03ae26e05cf2902fd7a793b6c30d314ae6e3b73fabf4 WatchSource:0}: Error finding container e8bc16e1a1a2023ac06d03ae26e05cf2902fd7a793b6c30d314ae6e3b73fabf4: Status 404 returned error can't find the container with id e8bc16e1a1a2023ac06d03ae26e05cf2902fd7a793b6c30d314ae6e3b73fabf4 Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.633164 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fsc5r"] Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.634534 4892 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.715239 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsc5r"] Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.762059 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb7qd\" (UniqueName: \"kubernetes.io/projected/28f724e3-47e3-47df-be60-cc0da5a15e25-kube-api-access-tb7qd\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.762147 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f724e3-47e3-47df-be60-cc0da5a15e25-catalog-content\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.762172 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f724e3-47e3-47df-be60-cc0da5a15e25-utilities\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.770962 4892 generic.go:334] "Generic (PLEG): container finished" podID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerID="792b0f5ab045c2d981e9f022986d9afadc5e6fe9beeb3f3507e0e33fef329c3d" exitCode=0 Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.771020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" 
event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerDied","Data":"792b0f5ab045c2d981e9f022986d9afadc5e6fe9beeb3f3507e0e33fef329c3d"} Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.775881 4892 generic.go:334] "Generic (PLEG): container finished" podID="3b7d2423-4734-4106-91f3-fd4f6712a9d0" containerID="a48339fdeae0d0b590c5b0eeb2980b72b9022267ddc7fa7cf3aa82db9c2a816d" exitCode=0 Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.775961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hghz" event={"ID":"3b7d2423-4734-4106-91f3-fd4f6712a9d0","Type":"ContainerDied","Data":"a48339fdeae0d0b590c5b0eeb2980b72b9022267ddc7fa7cf3aa82db9c2a816d"} Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.778336 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerStarted","Data":"f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc"} Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.778404 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerStarted","Data":"e8bc16e1a1a2023ac06d03ae26e05cf2902fd7a793b6c30d314ae6e3b73fabf4"} Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.826385 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pxrd" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.864555 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb7qd\" (UniqueName: \"kubernetes.io/projected/28f724e3-47e3-47df-be60-cc0da5a15e25-kube-api-access-tb7qd\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc 
kubenswrapper[4892]: I0217 17:49:29.864648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f724e3-47e3-47df-be60-cc0da5a15e25-catalog-content\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.864673 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f724e3-47e3-47df-be60-cc0da5a15e25-utilities\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.865131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f724e3-47e3-47df-be60-cc0da5a15e25-utilities\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.865359 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f724e3-47e3-47df-be60-cc0da5a15e25-catalog-content\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.906196 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb7qd\" (UniqueName: \"kubernetes.io/projected/28f724e3-47e3-47df-be60-cc0da5a15e25-kube-api-access-tb7qd\") pod \"community-operators-fsc5r\" (UID: \"28f724e3-47e3-47df-be60-cc0da5a15e25\") " pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:29 crc kubenswrapper[4892]: I0217 17:49:29.947021 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.149885 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsc5r"] Feb 17 17:49:30 crc kubenswrapper[4892]: W0217 17:49:30.157123 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f724e3_47e3_47df_be60_cc0da5a15e25.slice/crio-2d671abd1099d02bf527fd96d9fe502139161decf1dfad5a84d332e923e20467 WatchSource:0}: Error finding container 2d671abd1099d02bf527fd96d9fe502139161decf1dfad5a84d332e923e20467: Status 404 returned error can't find the container with id 2d671abd1099d02bf527fd96d9fe502139161decf1dfad5a84d332e923e20467 Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.644236 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6vvnr"] Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.646207 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.652283 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vvnr"] Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.786407 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsc5r" event={"ID":"28f724e3-47e3-47df-be60-cc0da5a15e25","Type":"ContainerStarted","Data":"2d671abd1099d02bf527fd96d9fe502139161decf1dfad5a84d332e923e20467"} Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.788625 4892 generic.go:334] "Generic (PLEG): container finished" podID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerID="f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc" exitCode=0 Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.788673 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerDied","Data":"f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc"} Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.799870 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwvb\" (UniqueName: \"kubernetes.io/projected/68547217-2996-4b25-9020-c2187ecfb42e-kube-api-access-pnwvb\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.799967 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-catalog-content\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc 
kubenswrapper[4892]: I0217 17:49:30.800001 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-utilities\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.901667 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-utilities\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.901781 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnwvb\" (UniqueName: \"kubernetes.io/projected/68547217-2996-4b25-9020-c2187ecfb42e-kube-api-access-pnwvb\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.901834 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-catalog-content\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.902988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-catalog-content\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.903026 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-utilities\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:30 crc kubenswrapper[4892]: I0217 17:49:30.926370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwvb\" (UniqueName: \"kubernetes.io/projected/68547217-2996-4b25-9020-c2187ecfb42e-kube-api-access-pnwvb\") pod \"redhat-operators-6vvnr\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.018264 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.605393 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.651609 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.805214 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnsqq" event={"ID":"b8e76551-8839-410b-9aec-a671ca2119dc","Type":"ContainerStarted","Data":"aab60b013de97225059f1f24b745c4783a5b3cf21cff83aef795c6459f00dd5b"} Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.807559 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsc5r" event={"ID":"28f724e3-47e3-47df-be60-cc0da5a15e25","Type":"ContainerDied","Data":"b9c4473c311aba27d81b205a916c679031348f4b5af6b820ebd6e085ce6ab816"} Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.807507 4892 generic.go:334] 
"Generic (PLEG): container finished" podID="28f724e3-47e3-47df-be60-cc0da5a15e25" containerID="b9c4473c311aba27d81b205a916c679031348f4b5af6b820ebd6e085ce6ab816" exitCode=0 Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.814013 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerStarted","Data":"f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636"} Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.821328 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerStarted","Data":"1c64371027f9a0b92aebbd32db4c5e47fb934474941206fbd2e67f5f92c2fedd"} Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.826865 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerStarted","Data":"af4a15d0b4f9f93d9b089cf26fd05394e4d9ac66db54ab531c48a2158c0c0841"} Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.907741 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jnfwl" podStartSLOduration=2.007651587 podStartE2EDuration="10.907719602s" podCreationTimestamp="2026-02-17 17:49:21 +0000 UTC" firstStartedPulling="2026-02-17 17:49:22.658378255 +0000 UTC m=+334.033781530" lastFinishedPulling="2026-02-17 17:49:31.55844628 +0000 UTC m=+342.933849545" observedRunningTime="2026-02-17 17:49:31.90724982 +0000 UTC m=+343.282653085" watchObservedRunningTime="2026-02-17 17:49:31.907719602 +0000 UTC m=+343.283122867" Feb 17 17:49:31 crc kubenswrapper[4892]: I0217 17:49:31.951508 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vvnr"] Feb 17 17:49:31 crc kubenswrapper[4892]: W0217 
17:49:31.961383 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68547217_2996_4b25_9020_c2187ecfb42e.slice/crio-e814ced2fb312d8217eec5ff46796c720b90d30cbd27b36cbfc3bcb0f60dde75 WatchSource:0}: Error finding container e814ced2fb312d8217eec5ff46796c720b90d30cbd27b36cbfc3bcb0f60dde75: Status 404 returned error can't find the container with id e814ced2fb312d8217eec5ff46796c720b90d30cbd27b36cbfc3bcb0f60dde75 Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.035111 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bv8xp"] Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.036748 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.048372 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bv8xp"] Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.220882 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-catalog-content\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.221001 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-utilities\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.221046 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6n5jq\" (UniqueName: \"kubernetes.io/projected/36d16e04-a79d-42f9-bcb8-e9115efb1eae-kube-api-access-6n5jq\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.322056 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5jq\" (UniqueName: \"kubernetes.io/projected/36d16e04-a79d-42f9-bcb8-e9115efb1eae-kube-api-access-6n5jq\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.322352 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-catalog-content\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.322416 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-utilities\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.322980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-catalog-content\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.324841 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-utilities\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.338841 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5jq\" (UniqueName: \"kubernetes.io/projected/36d16e04-a79d-42f9-bcb8-e9115efb1eae-kube-api-access-6n5jq\") pod \"certified-operators-bv8xp\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.363524 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.636083 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bv8xp"] Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.832892 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerStarted","Data":"2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.840355 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerStarted","Data":"8497fb9df9095ccf6b3defb12843d295646049056697a13f60fea815a49cd876"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.842681 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerStarted","Data":"2ba90fc2d816105e9338e3455c3d172641daf2c4e534d92d6555ad37ec55401b"} Feb 17 17:49:32 crc 
kubenswrapper[4892]: I0217 17:49:32.844072 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerStarted","Data":"68f6fa6edc0a5e9e95eaaa02dc0fcedf0c4145bbb806c96f182338c53924d977"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.844117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerStarted","Data":"c10c388e88659a93c55a7b3e10c4c2dba0447b09b6ede9c148ad91c4b935adba"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.845471 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerStarted","Data":"4377875f391f14d6766643a98ffd1dfef2c9c174e539553062ef8f25af15ca07"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.845529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerStarted","Data":"e814ced2fb312d8217eec5ff46796c720b90d30cbd27b36cbfc3bcb0f60dde75"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.847435 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hghz" event={"ID":"3b7d2423-4734-4106-91f3-fd4f6712a9d0","Type":"ContainerStarted","Data":"591a5e362de38c7da7ee90e92b48cf0cbf03c3ab148f22aaa27fab9c733fd90f"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.851741 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsc5r" event={"ID":"28f724e3-47e3-47df-be60-cc0da5a15e25","Type":"ContainerStarted","Data":"c367cf6f00ccb052677f2a632915b232e9aefee87ce88f9164c89651eea6c2bf"} Feb 17 17:49:32 crc kubenswrapper[4892]: I0217 17:49:32.949320 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x8f4q" podStartSLOduration=3.848319853 podStartE2EDuration="12.94930439s" podCreationTimestamp="2026-02-17 17:49:20 +0000 UTC" firstStartedPulling="2026-02-17 17:49:22.654554132 +0000 UTC m=+334.029957397" lastFinishedPulling="2026-02-17 17:49:31.755538669 +0000 UTC m=+343.130941934" observedRunningTime="2026-02-17 17:49:32.948280051 +0000 UTC m=+344.323683326" watchObservedRunningTime="2026-02-17 17:49:32.94930439 +0000 UTC m=+344.324707665" Feb 17 17:49:33 crc kubenswrapper[4892]: E0217 17:49:33.020400 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f724e3_47e3_47df_be60_cc0da5a15e25.slice/crio-c367cf6f00ccb052677f2a632915b232e9aefee87ce88f9164c89651eea6c2bf.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.035861 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n84ch"] Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.040395 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.048508 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n84ch"] Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.235095 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-utilities\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.235149 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-catalog-content\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.235220 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpchx\" (UniqueName: \"kubernetes.io/projected/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-kube-api-access-zpchx\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.336566 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpchx\" (UniqueName: \"kubernetes.io/projected/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-kube-api-access-zpchx\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.336929 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-utilities\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.337026 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-catalog-content\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.337608 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-utilities\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.337680 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-catalog-content\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.364258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpchx\" (UniqueName: \"kubernetes.io/projected/f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02-kube-api-access-zpchx\") pod \"community-operators-n84ch\" (UID: \"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02\") " pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.389711 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.622663 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n84ch"] Feb 17 17:49:33 crc kubenswrapper[4892]: W0217 17:49:33.630124 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf87ba7cc_b3a3_4df2_bc4e_4f9d892f5c02.slice/crio-36899fdf017a39f5c383e129d4ba07b5a04f212fb3cbd68638bfc899760376c2 WatchSource:0}: Error finding container 36899fdf017a39f5c383e129d4ba07b5a04f212fb3cbd68638bfc899760376c2: Status 404 returned error can't find the container with id 36899fdf017a39f5c383e129d4ba07b5a04f212fb3cbd68638bfc899760376c2 Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.809414 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-78dz9"] Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.810731 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.825554 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-78dz9"] Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.856160 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n84ch" event={"ID":"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02","Type":"ContainerStarted","Data":"36899fdf017a39f5c383e129d4ba07b5a04f212fb3cbd68638bfc899760376c2"} Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.944793 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.944973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6whjq\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-kube-api-access-6whjq\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.945078 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3892a207-79f1-4b65-bf0d-971b59b0fd1b-registry-certificates\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.945456 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3892a207-79f1-4b65-bf0d-971b59b0fd1b-trusted-ca\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.945501 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3892a207-79f1-4b65-bf0d-971b59b0fd1b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.945556 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3892a207-79f1-4b65-bf0d-971b59b0fd1b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.945630 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-registry-tls\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.945655 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-bound-sa-token\") pod \"image-registry-66df7c8f76-78dz9\" (UID: 
\"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:33 crc kubenswrapper[4892]: I0217 17:49:33.990056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047115 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-registry-tls\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047162 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-bound-sa-token\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047235 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6whjq\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-kube-api-access-6whjq\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047279 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3892a207-79f1-4b65-bf0d-971b59b0fd1b-registry-certificates\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047376 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3892a207-79f1-4b65-bf0d-971b59b0fd1b-trusted-ca\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047395 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3892a207-79f1-4b65-bf0d-971b59b0fd1b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.047433 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3892a207-79f1-4b65-bf0d-971b59b0fd1b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.048754 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3892a207-79f1-4b65-bf0d-971b59b0fd1b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.049593 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3892a207-79f1-4b65-bf0d-971b59b0fd1b-trusted-ca\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.050009 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3892a207-79f1-4b65-bf0d-971b59b0fd1b-registry-certificates\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.052530 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3892a207-79f1-4b65-bf0d-971b59b0fd1b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.065849 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-registry-tls\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.071015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6whjq\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-kube-api-access-6whjq\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 
17:49:34.077675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3892a207-79f1-4b65-bf0d-971b59b0fd1b-bound-sa-token\") pod \"image-registry-66df7c8f76-78dz9\" (UID: \"3892a207-79f1-4b65-bf0d-971b59b0fd1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.139585 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.443710 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ztgz4"] Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.447646 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.448781 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztgz4"] Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.558409 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgplj\" (UniqueName: \"kubernetes.io/projected/66354c96-340d-43c7-98d0-19713f857884-kube-api-access-jgplj\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.558731 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-utilities\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.558958 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-catalog-content\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.565050 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-78dz9"] Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.660924 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-utilities\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.660995 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-catalog-content\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.661097 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgplj\" (UniqueName: \"kubernetes.io/projected/66354c96-340d-43c7-98d0-19713f857884-kube-api-access-jgplj\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.661599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-utilities\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " 
pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.661879 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-catalog-content\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: W0217 17:49:34.662229 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3892a207_79f1_4b65_bf0d_971b59b0fd1b.slice/crio-771a3008d7110d719050eb965410941cf62db6fe6249a0d1c882dc988964b122 WatchSource:0}: Error finding container 771a3008d7110d719050eb965410941cf62db6fe6249a0d1c882dc988964b122: Status 404 returned error can't find the container with id 771a3008d7110d719050eb965410941cf62db6fe6249a0d1c882dc988964b122 Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.689467 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgplj\" (UniqueName: \"kubernetes.io/projected/66354c96-340d-43c7-98d0-19713f857884-kube-api-access-jgplj\") pod \"redhat-operators-ztgz4\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.864859 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" event={"ID":"3892a207-79f1-4b65-bf0d-971b59b0fd1b","Type":"ContainerStarted","Data":"771a3008d7110d719050eb965410941cf62db6fe6249a0d1c882dc988964b122"} Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.871104 4892 generic.go:334] "Generic (PLEG): container finished" podID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerID="f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636" exitCode=0 Feb 17 17:49:34 crc kubenswrapper[4892]: 
I0217 17:49:34.871210 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerDied","Data":"f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636"} Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.873801 4892 generic.go:334] "Generic (PLEG): container finished" podID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerID="1c64371027f9a0b92aebbd32db4c5e47fb934474941206fbd2e67f5f92c2fedd" exitCode=0 Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.873876 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerDied","Data":"1c64371027f9a0b92aebbd32db4c5e47fb934474941206fbd2e67f5f92c2fedd"} Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.877613 4892 generic.go:334] "Generic (PLEG): container finished" podID="b8e76551-8839-410b-9aec-a671ca2119dc" containerID="aab60b013de97225059f1f24b745c4783a5b3cf21cff83aef795c6459f00dd5b" exitCode=0 Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.877655 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnsqq" event={"ID":"b8e76551-8839-410b-9aec-a671ca2119dc","Type":"ContainerDied","Data":"aab60b013de97225059f1f24b745c4783a5b3cf21cff83aef795c6459f00dd5b"} Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.881404 4892 generic.go:334] "Generic (PLEG): container finished" podID="68547217-2996-4b25-9020-c2187ecfb42e" containerID="4377875f391f14d6766643a98ffd1dfef2c9c174e539553062ef8f25af15ca07" exitCode=0 Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.881453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" 
event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerDied","Data":"4377875f391f14d6766643a98ffd1dfef2c9c174e539553062ef8f25af15ca07"} Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.886148 4892 generic.go:334] "Generic (PLEG): container finished" podID="3b7d2423-4734-4106-91f3-fd4f6712a9d0" containerID="591a5e362de38c7da7ee90e92b48cf0cbf03c3ab148f22aaa27fab9c733fd90f" exitCode=0 Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.886190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hghz" event={"ID":"3b7d2423-4734-4106-91f3-fd4f6712a9d0","Type":"ContainerDied","Data":"591a5e362de38c7da7ee90e92b48cf0cbf03c3ab148f22aaa27fab9c733fd90f"} Feb 17 17:49:34 crc kubenswrapper[4892]: I0217 17:49:34.964206 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.208024 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztgz4"] Feb 17 17:49:35 crc kubenswrapper[4892]: W0217 17:49:35.232063 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66354c96_340d_43c7_98d0_19713f857884.slice/crio-93f38c7c7467fe902f1e67b205777619ca237bf0075010941b4ede85c2df88ac WatchSource:0}: Error finding container 93f38c7c7467fe902f1e67b205777619ca237bf0075010941b4ede85c2df88ac: Status 404 returned error can't find the container with id 93f38c7c7467fe902f1e67b205777619ca237bf0075010941b4ede85c2df88ac Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.434667 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5c7bz"] Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.436128 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.460334 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5c7bz"] Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.475283 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5b92\" (UniqueName: \"kubernetes.io/projected/30a36169-14a5-4d2a-9f66-fb343852b6a6-kube-api-access-d5b92\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.475430 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-utilities\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.475465 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-catalog-content\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.576242 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-utilities\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.576647 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-catalog-content\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.576700 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5b92\" (UniqueName: \"kubernetes.io/projected/30a36169-14a5-4d2a-9f66-fb343852b6a6-kube-api-access-d5b92\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.576763 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-utilities\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.576978 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-catalog-content\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.597801 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5b92\" (UniqueName: \"kubernetes.io/projected/30a36169-14a5-4d2a-9f66-fb343852b6a6-kube-api-access-d5b92\") pod \"certified-operators-5c7bz\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.873163 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.896295 4892 generic.go:334] "Generic (PLEG): container finished" podID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerID="8497fb9df9095ccf6b3defb12843d295646049056697a13f60fea815a49cd876" exitCode=0 Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.896483 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerDied","Data":"8497fb9df9095ccf6b3defb12843d295646049056697a13f60fea815a49cd876"} Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.903033 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerStarted","Data":"93f38c7c7467fe902f1e67b205777619ca237bf0075010941b4ede85c2df88ac"} Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.914797 4892 generic.go:334] "Generic (PLEG): container finished" podID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerID="68f6fa6edc0a5e9e95eaaa02dc0fcedf0c4145bbb806c96f182338c53924d977" exitCode=0 Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.914878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerDied","Data":"68f6fa6edc0a5e9e95eaaa02dc0fcedf0c4145bbb806c96f182338c53924d977"} Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.923481 4892 generic.go:334] "Generic (PLEG): container finished" podID="28f724e3-47e3-47df-be60-cc0da5a15e25" containerID="c367cf6f00ccb052677f2a632915b232e9aefee87ce88f9164c89651eea6c2bf" exitCode=0 Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.923567 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsc5r" 
event={"ID":"28f724e3-47e3-47df-be60-cc0da5a15e25","Type":"ContainerDied","Data":"c367cf6f00ccb052677f2a632915b232e9aefee87ce88f9164c89651eea6c2bf"} Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.951141 4892 generic.go:334] "Generic (PLEG): container finished" podID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerID="2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8" exitCode=0 Feb 17 17:49:35 crc kubenswrapper[4892]: I0217 17:49:35.951201 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerDied","Data":"2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8"} Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.353887 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5c7bz"] Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.617540 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.666896 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.962976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerStarted","Data":"4139f53a7cc8fef9ebc26414bb1382f5d99f4b3d4c71987c97e854fd6f15b6e3"} Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.965600 4892 generic.go:334] "Generic (PLEG): container finished" podID="f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02" containerID="e87161b6025e4b874bc1ca1c9e85d0f7310f6430c09d6d6a9bcba48a1f519804" exitCode=0 Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.965891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-n84ch" event={"ID":"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02","Type":"ContainerDied","Data":"e87161b6025e4b874bc1ca1c9e85d0f7310f6430c09d6d6a9bcba48a1f519804"} Feb 17 17:49:36 crc kubenswrapper[4892]: I0217 17:49:36.967125 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerStarted","Data":"1d9ae68afc2ff801f2b2b86fad2316504f9a41898364b972be58f60088ba7b3c"} Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.039334 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zcqks"] Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.041684 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.051251 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zcqks"] Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.193553 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a9e64-6027-45c1-be54-dc34baaf71c5-utilities\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.193663 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a9e64-6027-45c1-be54-dc34baaf71c5-catalog-content\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.193715 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnq8b\" (UniqueName: \"kubernetes.io/projected/a22a9e64-6027-45c1-be54-dc34baaf71c5-kube-api-access-cnq8b\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.294761 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a9e64-6027-45c1-be54-dc34baaf71c5-catalog-content\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.294853 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnq8b\" (UniqueName: \"kubernetes.io/projected/a22a9e64-6027-45c1-be54-dc34baaf71c5-kube-api-access-cnq8b\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.294897 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a9e64-6027-45c1-be54-dc34baaf71c5-utilities\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.295274 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a9e64-6027-45c1-be54-dc34baaf71c5-catalog-content\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.295295 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a9e64-6027-45c1-be54-dc34baaf71c5-utilities\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.320710 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnq8b\" (UniqueName: \"kubernetes.io/projected/a22a9e64-6027-45c1-be54-dc34baaf71c5-kube-api-access-cnq8b\") pod \"community-operators-zcqks\" (UID: \"a22a9e64-6027-45c1-be54-dc34baaf71c5\") " pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:37 crc kubenswrapper[4892]: I0217 17:49:37.369898 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.640091 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fj4g8"] Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.641606 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.649060 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fj4g8"] Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.805126 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-catalog-content\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.805203 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hbfn\" (UniqueName: \"kubernetes.io/projected/fee21c95-556a-4a25-8b38-273489f881e4-kube-api-access-2hbfn\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.805470 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-utilities\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.907232 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hbfn\" (UniqueName: \"kubernetes.io/projected/fee21c95-556a-4a25-8b38-273489f881e4-kube-api-access-2hbfn\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.907333 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-utilities\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.907428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-catalog-content\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.907972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-catalog-content\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.908184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-utilities\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.929184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hbfn\" (UniqueName: \"kubernetes.io/projected/fee21c95-556a-4a25-8b38-273489f881e4-kube-api-access-2hbfn\") pod \"redhat-operators-fj4g8\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.957411 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.979153 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerStarted","Data":"caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207"} Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.980702 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" event={"ID":"3892a207-79f1-4b65-bf0d-971b59b0fd1b","Type":"ContainerStarted","Data":"3b0c94a5ca65896f0c2eddac571b1d53d0fc13fff2e70ba9426fa318fd21d860"} Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.981949 4892 generic.go:334] "Generic (PLEG): container finished" podID="66354c96-340d-43c7-98d0-19713f857884" containerID="4139f53a7cc8fef9ebc26414bb1382f5d99f4b3d4c71987c97e854fd6f15b6e3" exitCode=0 Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:37.982026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerDied","Data":"4139f53a7cc8fef9ebc26414bb1382f5d99f4b3d4c71987c97e854fd6f15b6e3"} Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:38.025516 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:38.424770 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zcqks"] Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:38.518921 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fj4g8"] Feb 17 17:49:38 crc kubenswrapper[4892]: W0217 17:49:38.522374 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee21c95_556a_4a25_8b38_273489f881e4.slice/crio-7f76615bafe5269546c45acc46de2a55a419580d92eababd140e708cecbcd664 WatchSource:0}: Error finding container 7f76615bafe5269546c45acc46de2a55a419580d92eababd140e708cecbcd664: Status 404 returned error can't find the container with id 7f76615bafe5269546c45acc46de2a55a419580d92eababd140e708cecbcd664 Feb 17 17:49:38 crc kubenswrapper[4892]: I0217 17:49:38.990803 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcqks" event={"ID":"a22a9e64-6027-45c1-be54-dc34baaf71c5","Type":"ContainerStarted","Data":"82c06fd1bf757f62a81200ae1ebd294c31fb3bcc17caf28f9ba093bebd348b87"} Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.009514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerStarted","Data":"7f76615bafe5269546c45acc46de2a55a419580d92eababd140e708cecbcd664"} Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.234762 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qj4k"] Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.236241 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.243432 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qj4k"] Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.272671 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-utilities\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.273919 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tg25\" (UniqueName: \"kubernetes.io/projected/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-kube-api-access-6tg25\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.274140 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-catalog-content\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.375005 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-catalog-content\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.375555 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-catalog-content\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.375832 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-utilities\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.375867 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tg25\" (UniqueName: \"kubernetes.io/projected/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-kube-api-access-6tg25\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.376370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-utilities\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.400282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tg25\" (UniqueName: \"kubernetes.io/projected/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-kube-api-access-6tg25\") pod \"certified-operators-6qj4k\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.602700 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:49:39 crc kubenswrapper[4892]: I0217 17:49:39.817737 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qj4k"] Feb 17 17:49:39 crc kubenswrapper[4892]: W0217 17:49:39.822754 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8823d3_7d81_4ab1_8726_e77a2d0024e6.slice/crio-71c955501ff6260aa202dee21dfc569b10fa6621cf31fe425022cce59e8e21d4 WatchSource:0}: Error finding container 71c955501ff6260aa202dee21dfc569b10fa6621cf31fe425022cce59e8e21d4: Status 404 returned error can't find the container with id 71c955501ff6260aa202dee21dfc569b10fa6621cf31fe425022cce59e8e21d4 Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.016218 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerStarted","Data":"71c955501ff6260aa202dee21dfc569b10fa6621cf31fe425022cce59e8e21d4"} Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.016371 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.240690 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" podStartSLOduration=7.240670847 podStartE2EDuration="7.240670847s" podCreationTimestamp="2026-02-17 17:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:49:40.03659542 +0000 UTC m=+351.411998695" watchObservedRunningTime="2026-02-17 17:49:40.240670847 +0000 UTC m=+351.616074112" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.242472 4892 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-jstz7"] Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.244054 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.257022 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jstz7"] Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.288841 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20048710-1af1-4532-8a8d-c8f325bcc2a1-utilities\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.288896 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20048710-1af1-4532-8a8d-c8f325bcc2a1-catalog-content\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.288969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ntp\" (UniqueName: \"kubernetes.io/projected/20048710-1af1-4532-8a8d-c8f325bcc2a1-kube-api-access-x8ntp\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.349333 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.349571 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.390009 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.391258 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20048710-1af1-4532-8a8d-c8f325bcc2a1-utilities\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.391731 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20048710-1af1-4532-8a8d-c8f325bcc2a1-utilities\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.391793 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20048710-1af1-4532-8a8d-c8f325bcc2a1-catalog-content\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.392024 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20048710-1af1-4532-8a8d-c8f325bcc2a1-catalog-content\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.392100 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ntp\" (UniqueName: 
\"kubernetes.io/projected/20048710-1af1-4532-8a8d-c8f325bcc2a1-kube-api-access-x8ntp\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.413520 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ntp\" (UniqueName: \"kubernetes.io/projected/20048710-1af1-4532-8a8d-c8f325bcc2a1-kube-api-access-x8ntp\") pod \"community-operators-jstz7\" (UID: \"20048710-1af1-4532-8a8d-c8f325bcc2a1\") " pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.558365 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:49:40 crc kubenswrapper[4892]: I0217 17:49:40.984323 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jstz7"] Feb 17 17:49:40 crc kubenswrapper[4892]: W0217 17:49:40.990237 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20048710_1af1_4532_8a8d_c8f325bcc2a1.slice/crio-d08c4d4fc697bdcb73470b1b120c605051da6aabbc5cfd3ac74bf0b3228878bd WatchSource:0}: Error finding container d08c4d4fc697bdcb73470b1b120c605051da6aabbc5cfd3ac74bf0b3228878bd: Status 404 returned error can't find the container with id d08c4d4fc697bdcb73470b1b120c605051da6aabbc5cfd3ac74bf0b3228878bd Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.024067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jstz7" event={"ID":"20048710-1af1-4532-8a8d-c8f325bcc2a1","Type":"ContainerStarted","Data":"d08c4d4fc697bdcb73470b1b120c605051da6aabbc5cfd3ac74bf0b3228878bd"} Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.025836 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerStarted","Data":"557f85612b4b8b4d561466b7153151f8ec59e2074c2dabb0b90a2fe562b3c1aa"} Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.077042 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.358882 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.366166 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.407339 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.443661 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jxtbb"] Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.445196 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.466394 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxtbb"] Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.510328 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-utilities\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.510424 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-catalog-content\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.510486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl52n\" (UniqueName: \"kubernetes.io/projected/3425377d-d147-4bc3-a063-5e3c9456d2f9-kube-api-access-kl52n\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.611174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-utilities\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.611399 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-catalog-content\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.611437 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl52n\" (UniqueName: \"kubernetes.io/projected/3425377d-d147-4bc3-a063-5e3c9456d2f9-kube-api-access-kl52n\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.611587 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-utilities\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.611824 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-catalog-content\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.636920 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl52n\" (UniqueName: \"kubernetes.io/projected/3425377d-d147-4bc3-a063-5e3c9456d2f9-kube-api-access-kl52n\") pod \"redhat-operators-jxtbb\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:41 crc kubenswrapper[4892]: I0217 17:49:41.761038 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.032137 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerStarted","Data":"ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09"} Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.033622 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcqks" event={"ID":"a22a9e64-6027-45c1-be54-dc34baaf71c5","Type":"ContainerStarted","Data":"79d025f0927da55b87e8cf63a01bc7f89730004f2c9bbdd33edcc671a3285213"} Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.086878 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.854422 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5cd89"] Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.858901 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.870481 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cd89"] Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.946406 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47pnh\" (UniqueName: \"kubernetes.io/projected/94cfcd99-3052-49c6-991c-571a85bdeba5-kube-api-access-47pnh\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.946650 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-catalog-content\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:42 crc kubenswrapper[4892]: I0217 17:49:42.946718 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-utilities\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.046950 4892 generic.go:334] "Generic (PLEG): container finished" podID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerID="caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207" exitCode=0 Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.047653 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47pnh\" (UniqueName: 
\"kubernetes.io/projected/94cfcd99-3052-49c6-991c-571a85bdeba5-kube-api-access-47pnh\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.047088 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerDied","Data":"caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207"} Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.047739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-catalog-content\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.047803 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-utilities\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.048442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-utilities\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.048552 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-catalog-content\") pod \"certified-operators-5cd89\" (UID: 
\"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.051325 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jstz7" event={"ID":"20048710-1af1-4532-8a8d-c8f325bcc2a1","Type":"ContainerStarted","Data":"b55284aa074acd537c3cc8b70b8b6560705f03e304e5d944eb9b1cec10408989"} Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.082773 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47pnh\" (UniqueName: \"kubernetes.io/projected/94cfcd99-3052-49c6-991c-571a85bdeba5-kube-api-access-47pnh\") pod \"certified-operators-5cd89\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.182894 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cd89" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.835139 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dhfx5"] Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.836990 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.842028 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhfx5"] Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.858264 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7700d4bf-d2a8-4c27-bba9-712bde89f76c-utilities\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.858395 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7700d4bf-d2a8-4c27-bba9-712bde89f76c-catalog-content\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.858524 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8kzc\" (UniqueName: \"kubernetes.io/projected/7700d4bf-d2a8-4c27-bba9-712bde89f76c-kube-api-access-h8kzc\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.959798 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7700d4bf-d2a8-4c27-bba9-712bde89f76c-utilities\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.959941 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7700d4bf-d2a8-4c27-bba9-712bde89f76c-catalog-content\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.959989 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8kzc\" (UniqueName: \"kubernetes.io/projected/7700d4bf-d2a8-4c27-bba9-712bde89f76c-kube-api-access-h8kzc\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.960332 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7700d4bf-d2a8-4c27-bba9-712bde89f76c-utilities\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.960456 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7700d4bf-d2a8-4c27-bba9-712bde89f76c-catalog-content\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:43 crc kubenswrapper[4892]: I0217 17:49:43.979099 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8kzc\" (UniqueName: \"kubernetes.io/projected/7700d4bf-d2a8-4c27-bba9-712bde89f76c-kube-api-access-h8kzc\") pod \"community-operators-dhfx5\" (UID: \"7700d4bf-d2a8-4c27-bba9-712bde89f76c\") " pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:44 crc kubenswrapper[4892]: I0217 17:49:44.206083 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dhfx5" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.235295 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbtj9"] Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.236789 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.250548 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbtj9"] Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.282139 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-utilities\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.282514 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hmc\" (UniqueName: \"kubernetes.io/projected/d276a412-68f8-4069-9aa2-275fdb23997d-kube-api-access-l5hmc\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.282769 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-catalog-content\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.385481 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-utilities\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.385636 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hmc\" (UniqueName: \"kubernetes.io/projected/d276a412-68f8-4069-9aa2-275fdb23997d-kube-api-access-l5hmc\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.385694 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-catalog-content\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.386789 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-catalog-content\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.387490 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-utilities\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.415415 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hmc\" (UniqueName: 
\"kubernetes.io/projected/d276a412-68f8-4069-9aa2-275fdb23997d-kube-api-access-l5hmc\") pod \"redhat-operators-pbtj9\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:45 crc kubenswrapper[4892]: I0217 17:49:45.562383 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.247325 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rmmjd"] Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.249315 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.258847 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmmjd"] Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.428518 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-catalog-content\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.428606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8qqz\" (UniqueName: \"kubernetes.io/projected/12793f08-a0ab-411a-9449-b0b2d0834e5e-kube-api-access-j8qqz\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.428657 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-utilities\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.529526 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-utilities\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.529838 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-catalog-content\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.529953 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8qqz\" (UniqueName: \"kubernetes.io/projected/12793f08-a0ab-411a-9449-b0b2d0834e5e-kube-api-access-j8qqz\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.530358 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-catalog-content\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.530772 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-utilities\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.555710 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8qqz\" (UniqueName: \"kubernetes.io/projected/12793f08-a0ab-411a-9449-b0b2d0834e5e-kube-api-access-j8qqz\") pod \"certified-operators-rmmjd\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:46 crc kubenswrapper[4892]: I0217 17:49:46.646254 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.644044 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hwn4v"] Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.646302 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.663274 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwn4v"] Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.748743 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7833e24-e8f2-422a-8759-be2e54c1f6ee-utilities\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.749145 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7833e24-e8f2-422a-8759-be2e54c1f6ee-catalog-content\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.749284 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226vd\" (UniqueName: \"kubernetes.io/projected/a7833e24-e8f2-422a-8759-be2e54c1f6ee-kube-api-access-226vd\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.850543 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7833e24-e8f2-422a-8759-be2e54c1f6ee-catalog-content\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.850998 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-226vd\" (UniqueName: \"kubernetes.io/projected/a7833e24-e8f2-422a-8759-be2e54c1f6ee-kube-api-access-226vd\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.851077 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7833e24-e8f2-422a-8759-be2e54c1f6ee-utilities\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.851548 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7833e24-e8f2-422a-8759-be2e54c1f6ee-catalog-content\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.851780 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7833e24-e8f2-422a-8759-be2e54c1f6ee-utilities\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.882645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-226vd\" (UniqueName: \"kubernetes.io/projected/a7833e24-e8f2-422a-8759-be2e54c1f6ee-kube-api-access-226vd\") pod \"community-operators-hwn4v\" (UID: \"a7833e24-e8f2-422a-8759-be2e54c1f6ee\") " pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:47 crc kubenswrapper[4892]: I0217 17:49:47.976911 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.439425 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5v59h"] Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.440697 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.455711 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5v59h"] Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.564224 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpsgw\" (UniqueName: \"kubernetes.io/projected/272b1940-458c-4610-a618-fcc2a4fab95c-kube-api-access-gpsgw\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.564498 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-catalog-content\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.564590 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-utilities\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.666887 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gpsgw\" (UniqueName: \"kubernetes.io/projected/272b1940-458c-4610-a618-fcc2a4fab95c-kube-api-access-gpsgw\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.666993 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-catalog-content\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.667032 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-utilities\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.667571 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-catalog-content\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.667795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-utilities\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.686673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpsgw\" (UniqueName: 
\"kubernetes.io/projected/272b1940-458c-4610-a618-fcc2a4fab95c-kube-api-access-gpsgw\") pod \"redhat-operators-5v59h\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:48 crc kubenswrapper[4892]: I0217 17:49:48.766864 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 17:49:54 crc kubenswrapper[4892]: I0217 17:49:54.148922 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-78dz9" Feb 17 17:49:54 crc kubenswrapper[4892]: I0217 17:49:54.201015 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cnjbv"] Feb 17 17:49:55 crc kubenswrapper[4892]: I0217 17:49:55.145753 4892 generic.go:334] "Generic (PLEG): container finished" podID="fee21c95-556a-4a25-8b38-273489f881e4" containerID="557f85612b4b8b4d561466b7153151f8ec59e2074c2dabb0b90a2fe562b3c1aa" exitCode=0 Feb 17 17:49:55 crc kubenswrapper[4892]: I0217 17:49:55.145883 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerDied","Data":"557f85612b4b8b4d561466b7153151f8ec59e2074c2dabb0b90a2fe562b3c1aa"} Feb 17 17:49:56 crc kubenswrapper[4892]: I0217 17:49:56.156342 4892 generic.go:334] "Generic (PLEG): container finished" podID="a22a9e64-6027-45c1-be54-dc34baaf71c5" containerID="79d025f0927da55b87e8cf63a01bc7f89730004f2c9bbdd33edcc671a3285213" exitCode=0 Feb 17 17:49:56 crc kubenswrapper[4892]: I0217 17:49:56.156461 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcqks" event={"ID":"a22a9e64-6027-45c1-be54-dc34baaf71c5","Type":"ContainerDied","Data":"79d025f0927da55b87e8cf63a01bc7f89730004f2c9bbdd33edcc671a3285213"} Feb 17 17:49:56 crc kubenswrapper[4892]: I0217 
17:49:56.159147 4892 generic.go:334] "Generic (PLEG): container finished" podID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerID="ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09" exitCode=0 Feb 17 17:49:56 crc kubenswrapper[4892]: I0217 17:49:56.159206 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerDied","Data":"ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09"} Feb 17 17:49:57 crc kubenswrapper[4892]: I0217 17:49:57.177878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jstz7" event={"ID":"20048710-1af1-4532-8a8d-c8f325bcc2a1","Type":"ContainerDied","Data":"b55284aa074acd537c3cc8b70b8b6560705f03e304e5d944eb9b1cec10408989"} Feb 17 17:49:57 crc kubenswrapper[4892]: I0217 17:49:57.179358 4892 generic.go:334] "Generic (PLEG): container finished" podID="20048710-1af1-4532-8a8d-c8f325bcc2a1" containerID="b55284aa074acd537c3cc8b70b8b6560705f03e304e5d944eb9b1cec10408989" exitCode=0 Feb 17 17:50:00 crc kubenswrapper[4892]: E0217 17:50:00.701515 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 17:50:00 crc kubenswrapper[4892]: E0217 17:50:00.701931 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgplj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ztgz4_openshift-marketplace(66354c96-340d-43c7-98d0-19713f857884): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 17:50:00 crc kubenswrapper[4892]: E0217 17:50:00.703729 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ztgz4" podUID="66354c96-340d-43c7-98d0-19713f857884" Feb 17 17:50:00 crc 
kubenswrapper[4892]: I0217 17:50:00.928312 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxtbb"] Feb 17 17:50:00 crc kubenswrapper[4892]: W0217 17:50:00.968471 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3425377d_d147_4bc3_a063_5e3c9456d2f9.slice/crio-2127c8729a6c593f3d6bebab6d244305f40ba0ecef8b5c7333a8e09dc89d1c5a WatchSource:0}: Error finding container 2127c8729a6c593f3d6bebab6d244305f40ba0ecef8b5c7333a8e09dc89d1c5a: Status 404 returned error can't find the container with id 2127c8729a6c593f3d6bebab6d244305f40ba0ecef8b5c7333a8e09dc89d1c5a Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.216170 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerStarted","Data":"2127c8729a6c593f3d6bebab6d244305f40ba0ecef8b5c7333a8e09dc89d1c5a"} Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.537423 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmmjd"] Feb 17 17:50:01 crc kubenswrapper[4892]: W0217 17:50:01.579001 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12793f08_a0ab_411a_9449_b0b2d0834e5e.slice/crio-705ce30159b8d544970c18e0e288bcd207bbfa658193fe13e118d29b76e471c3 WatchSource:0}: Error finding container 705ce30159b8d544970c18e0e288bcd207bbfa658193fe13e118d29b76e471c3: Status 404 returned error can't find the container with id 705ce30159b8d544970c18e0e288bcd207bbfa658193fe13e118d29b76e471c3 Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.598388 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5v59h"] Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.603788 4892 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-5cd89"] Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.608937 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwn4v"] Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.623227 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhfx5"] Feb 17 17:50:01 crc kubenswrapper[4892]: I0217 17:50:01.700038 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbtj9"] Feb 17 17:50:01 crc kubenswrapper[4892]: W0217 17:50:01.719402 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7700d4bf_d2a8_4c27_bba9_712bde89f76c.slice/crio-8a477e8da71643bf2a3b768dd175ed826feca395348b18b812f25f0ed70f156f WatchSource:0}: Error finding container 8a477e8da71643bf2a3b768dd175ed826feca395348b18b812f25f0ed70f156f: Status 404 returned error can't find the container with id 8a477e8da71643bf2a3b768dd175ed826feca395348b18b812f25f0ed70f156f Feb 17 17:50:01 crc kubenswrapper[4892]: W0217 17:50:01.757622 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd276a412_68f8_4069_9aa2_275fdb23997d.slice/crio-a409622bc7625257df8b70e7b1f49962ec72aa2700e539dda6fee1984e054289 WatchSource:0}: Error finding container a409622bc7625257df8b70e7b1f49962ec72aa2700e539dda6fee1984e054289: Status 404 returned error can't find the container with id a409622bc7625257df8b70e7b1f49962ec72aa2700e539dda6fee1984e054289 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.227987 4892 generic.go:334] "Generic (PLEG): container finished" podID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerID="67d1ec9b26c08b9dca9a5b81f44c80be7e8fa8e99899ac04645043c9c49500d2" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.228075 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerDied","Data":"67d1ec9b26c08b9dca9a5b81f44c80be7e8fa8e99899ac04645043c9c49500d2"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.248265 4892 generic.go:334] "Generic (PLEG): container finished" podID="a7833e24-e8f2-422a-8759-be2e54c1f6ee" containerID="67237d93bb134b10d1bd0350f926738733c064a7892abd239bfd291f558c9950" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.248363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwn4v" event={"ID":"a7833e24-e8f2-422a-8759-be2e54c1f6ee","Type":"ContainerDied","Data":"67237d93bb134b10d1bd0350f926738733c064a7892abd239bfd291f558c9950"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.248391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwn4v" event={"ID":"a7833e24-e8f2-422a-8759-be2e54c1f6ee","Type":"ContainerStarted","Data":"6943410c350fe6a1d7cf661bf3b6044e302999be5913ec1d3695b51a958fa17c"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.264990 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerStarted","Data":"5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.284528 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerStarted","Data":"76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.319119 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" 
event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerStarted","Data":"e57bf64fd272b83c9b0c76454389d556e20f18a0cfda0881aa6288c49cb07d10"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.342903 4892 generic.go:334] "Generic (PLEG): container finished" podID="272b1940-458c-4610-a618-fcc2a4fab95c" containerID="a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.343018 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerDied","Data":"a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.343047 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerStarted","Data":"53759a00a3b1e0e47a65764e82e44b06af984fccbc7bb1f7a409900a5eca81ed"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.360460 4892 generic.go:334] "Generic (PLEG): container finished" podID="d276a412-68f8-4069-9aa2-275fdb23997d" containerID="aaf6c2d6daab549efc9c97fa98a8f39f4a98e15e2b90d63bf017d3878089637c" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.360586 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerDied","Data":"aaf6c2d6daab549efc9c97fa98a8f39f4a98e15e2b90d63bf017d3878089637c"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.360618 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerStarted","Data":"a409622bc7625257df8b70e7b1f49962ec72aa2700e539dda6fee1984e054289"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 
17:50:02.382099 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerStarted","Data":"554ddb300fba674d1cc5a1a361ac7dfbc35d6e4f749cb9231a2a8532bd44450e"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.395136 4892 generic.go:334] "Generic (PLEG): container finished" podID="7700d4bf-d2a8-4c27-bba9-712bde89f76c" containerID="819cb379784d4ddb90948d015f96b15c91a155afeca774731c30f2428070db00" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.395253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhfx5" event={"ID":"7700d4bf-d2a8-4c27-bba9-712bde89f76c","Type":"ContainerDied","Data":"819cb379784d4ddb90948d015f96b15c91a155afeca774731c30f2428070db00"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.395288 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhfx5" event={"ID":"7700d4bf-d2a8-4c27-bba9-712bde89f76c","Type":"ContainerStarted","Data":"8a477e8da71643bf2a3b768dd175ed826feca395348b18b812f25f0ed70f156f"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.398541 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gfc4l" podStartSLOduration=4.184302509 podStartE2EDuration="35.398520091s" podCreationTimestamp="2026-02-17 17:49:27 +0000 UTC" firstStartedPulling="2026-02-17 17:49:29.772662372 +0000 UTC m=+341.148065647" lastFinishedPulling="2026-02-17 17:50:00.986879954 +0000 UTC m=+372.362283229" observedRunningTime="2026-02-17 17:50:02.359703058 +0000 UTC m=+373.735106323" watchObservedRunningTime="2026-02-17 17:50:02.398520091 +0000 UTC m=+373.773923356" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.411982 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" 
event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerStarted","Data":"098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.429848 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerStarted","Data":"1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.431962 4892 generic.go:334] "Generic (PLEG): container finished" podID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerID="5260f20fe3b032cf80ab688ac96b7b0facf7d74885a46af27ad7a71d3b6a74b2" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.432021 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerDied","Data":"5260f20fe3b032cf80ab688ac96b7b0facf7d74885a46af27ad7a71d3b6a74b2"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.473441 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hghz" event={"ID":"3b7d2423-4734-4106-91f3-fd4f6712a9d0","Type":"ContainerStarted","Data":"e883194d5e15190df601f85baa6fb5c4d9261896a591d52356197331f3a5cc41"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.493197 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcqks" event={"ID":"a22a9e64-6027-45c1-be54-dc34baaf71c5","Type":"ContainerStarted","Data":"abc5b3d10ceb9e864636bfc7bd385a89a46d0fd150c31c69bc98d88a0324c34d"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.496037 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ljjvr" podStartSLOduration=5.257299415 podStartE2EDuration="38.496019583s" podCreationTimestamp="2026-02-17 
17:49:24 +0000 UTC" firstStartedPulling="2026-02-17 17:49:27.749321366 +0000 UTC m=+339.124724631" lastFinishedPulling="2026-02-17 17:50:00.988041524 +0000 UTC m=+372.363444799" observedRunningTime="2026-02-17 17:50:02.494569364 +0000 UTC m=+373.869972629" watchObservedRunningTime="2026-02-17 17:50:02.496019583 +0000 UTC m=+373.871422838" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.506118 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnsqq" event={"ID":"b8e76551-8839-410b-9aec-a671ca2119dc","Type":"ContainerStarted","Data":"87df6024877141fb32a3791a22dda23a2f47e5e78c6bee52d9c168fdbd728fbe"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.517982 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerStarted","Data":"168b7bdb369be71c31ac2c71505e1a44233545383d46375e567d36c7a7f38f6d"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.520561 4892 generic.go:334] "Generic (PLEG): container finished" podID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerID="f89a549aee8b2cfcccc4c26aa6cd254abec5f29488ed0084d00d0b7760e7fb32" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.520604 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerDied","Data":"f89a549aee8b2cfcccc4c26aa6cd254abec5f29488ed0084d00d0b7760e7fb32"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.520621 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerStarted","Data":"78c3dbdb10ebc9c0da599587dbf59ebe7bd05ff8e72754b8eaead5e05c5bd1ba"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.522526 4892 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k2k8h" podStartSLOduration=5.56764842 podStartE2EDuration="39.522504155s" podCreationTimestamp="2026-02-17 17:49:23 +0000 UTC" firstStartedPulling="2026-02-17 17:49:26.737850108 +0000 UTC m=+338.113253393" lastFinishedPulling="2026-02-17 17:50:00.692705853 +0000 UTC m=+372.068109128" observedRunningTime="2026-02-17 17:50:02.516084323 +0000 UTC m=+373.891487588" watchObservedRunningTime="2026-02-17 17:50:02.522504155 +0000 UTC m=+373.897907410" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.546430 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsc5r" event={"ID":"28f724e3-47e3-47df-be60-cc0da5a15e25","Type":"ContainerStarted","Data":"680dd934e408aad8bc359200d7422d2b4d6e49bff324e07be9555d2e88259150"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.548619 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerStarted","Data":"cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.557218 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jstz7" event={"ID":"20048710-1af1-4532-8a8d-c8f325bcc2a1","Type":"ContainerStarted","Data":"b69c5de29c892b6be92f6c8143dc2c8ef98d46737b1c288ca6f7ba8b44676200"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.559743 4892 generic.go:334] "Generic (PLEG): container finished" podID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerID="b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.559786 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" 
event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerDied","Data":"b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.559802 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerStarted","Data":"705ce30159b8d544970c18e0e288bcd207bbfa658193fe13e118d29b76e471c3"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.566390 4892 generic.go:334] "Generic (PLEG): container finished" podID="f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02" containerID="9cb163b05d4e9e7ab8bbdbbee7dd3026b665255dfa36a2295ff220c0534c4092" exitCode=0 Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.566478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n84ch" event={"ID":"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02","Type":"ContainerDied","Data":"9cb163b05d4e9e7ab8bbdbbee7dd3026b665255dfa36a2295ff220c0534c4092"} Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.585981 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hghz" podStartSLOduration=6.663069163 podStartE2EDuration="37.585959802s" podCreationTimestamp="2026-02-17 17:49:25 +0000 UTC" firstStartedPulling="2026-02-17 17:49:29.777330757 +0000 UTC m=+341.152734022" lastFinishedPulling="2026-02-17 17:50:00.700221356 +0000 UTC m=+372.075624661" observedRunningTime="2026-02-17 17:50:02.58405556 +0000 UTC m=+373.959458845" watchObservedRunningTime="2026-02-17 17:50:02.585959802 +0000 UTC m=+373.961363077" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.629894 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fsc5r" podStartSLOduration=4.686578274 podStartE2EDuration="33.629868653s" podCreationTimestamp="2026-02-17 17:49:29 +0000 UTC" 
firstStartedPulling="2026-02-17 17:49:31.808825532 +0000 UTC m=+343.184228797" lastFinishedPulling="2026-02-17 17:50:00.752115911 +0000 UTC m=+372.127519176" observedRunningTime="2026-02-17 17:50:02.624456167 +0000 UTC m=+373.999859432" watchObservedRunningTime="2026-02-17 17:50:02.629868653 +0000 UTC m=+374.005271918" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.643958 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xnsqq" podStartSLOduration=6.421759628 podStartE2EDuration="40.643941501s" podCreationTimestamp="2026-02-17 17:49:22 +0000 UTC" firstStartedPulling="2026-02-17 17:49:26.73006833 +0000 UTC m=+338.105471615" lastFinishedPulling="2026-02-17 17:50:00.952250213 +0000 UTC m=+372.327653488" observedRunningTime="2026-02-17 17:50:02.640935959 +0000 UTC m=+374.016339224" watchObservedRunningTime="2026-02-17 17:50:02.643941501 +0000 UTC m=+374.019344766" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.662962 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hw9fh" podStartSLOduration=5.275785737 podStartE2EDuration="34.662942341s" podCreationTimestamp="2026-02-17 17:49:28 +0000 UTC" firstStartedPulling="2026-02-17 17:49:31.334964411 +0000 UTC m=+342.710367676" lastFinishedPulling="2026-02-17 17:50:00.722120975 +0000 UTC m=+372.097524280" observedRunningTime="2026-02-17 17:50:02.657832665 +0000 UTC m=+374.033235930" watchObservedRunningTime="2026-02-17 17:50:02.662942341 +0000 UTC m=+374.038345606" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.808551 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:50:02 crc kubenswrapper[4892]: I0217 17:50:02.808594 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:50:03 crc 
kubenswrapper[4892]: I0217 17:50:03.584193 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n84ch" event={"ID":"f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02","Type":"ContainerStarted","Data":"3edceb7a5cfa5964f4844a0839a370881310e96f40b6b80d4de91cf607043af9"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.595860 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerStarted","Data":"17754112c2787e44e163a30f9110738ef9beb1dee8705a1644d1a377921c6503"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.605388 4892 generic.go:334] "Generic (PLEG): container finished" podID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerID="5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee" exitCode=0 Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.605452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerDied","Data":"5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.615327 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n84ch" podStartSLOduration=5.4939962 podStartE2EDuration="30.615301839s" podCreationTimestamp="2026-02-17 17:49:33 +0000 UTC" firstStartedPulling="2026-02-17 17:49:37.985307653 +0000 UTC m=+349.360710938" lastFinishedPulling="2026-02-17 17:50:03.106613312 +0000 UTC m=+374.482016577" observedRunningTime="2026-02-17 17:50:03.610321746 +0000 UTC m=+374.985725021" watchObservedRunningTime="2026-02-17 17:50:03.615301839 +0000 UTC m=+374.990705104" Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.619603 4892 generic.go:334] "Generic (PLEG): container finished" podID="fee21c95-556a-4a25-8b38-273489f881e4" 
containerID="554ddb300fba674d1cc5a1a361ac7dfbc35d6e4f749cb9231a2a8532bd44450e" exitCode=0 Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.619693 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerDied","Data":"554ddb300fba674d1cc5a1a361ac7dfbc35d6e4f749cb9231a2a8532bd44450e"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.627486 4892 generic.go:334] "Generic (PLEG): container finished" podID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerID="76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac" exitCode=0 Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.627560 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerDied","Data":"76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.632427 4892 generic.go:334] "Generic (PLEG): container finished" podID="20048710-1af1-4532-8a8d-c8f325bcc2a1" containerID="b69c5de29c892b6be92f6c8143dc2c8ef98d46737b1c288ca6f7ba8b44676200" exitCode=0 Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.632492 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jstz7" event={"ID":"20048710-1af1-4532-8a8d-c8f325bcc2a1","Type":"ContainerDied","Data":"b69c5de29c892b6be92f6c8143dc2c8ef98d46737b1c288ca6f7ba8b44676200"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.632512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jstz7" event={"ID":"20048710-1af1-4532-8a8d-c8f325bcc2a1","Type":"ContainerStarted","Data":"207295f7ca5f8aa659cf902c41c5f5f2b25202b183f6e5d567a707a02aa7b898"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.648640 4892 generic.go:334] "Generic (PLEG): container 
finished" podID="a22a9e64-6027-45c1-be54-dc34baaf71c5" containerID="abc5b3d10ceb9e864636bfc7bd385a89a46d0fd150c31c69bc98d88a0324c34d" exitCode=0 Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.648745 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcqks" event={"ID":"a22a9e64-6027-45c1-be54-dc34baaf71c5","Type":"ContainerDied","Data":"abc5b3d10ceb9e864636bfc7bd385a89a46d0fd150c31c69bc98d88a0324c34d"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.648785 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcqks" event={"ID":"a22a9e64-6027-45c1-be54-dc34baaf71c5","Type":"ContainerStarted","Data":"eb0b0d03c5b97da6a80fced38a04f1b71ba042f1a5c57ed2cc321aabba53be7a"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.667079 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerStarted","Data":"803ffa13fee64053edce941066bb95ac5f892a4716a2b03b73a3fa0c42dae1ee"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.690280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwn4v" event={"ID":"a7833e24-e8f2-422a-8759-be2e54c1f6ee","Type":"ContainerStarted","Data":"e7a7526845c320ce355b50a58e722e7436fe13d19dfa120fb3ab513a313cfb1d"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.766122 4892 generic.go:334] "Generic (PLEG): container finished" podID="68547217-2996-4b25-9020-c2187ecfb42e" containerID="168b7bdb369be71c31ac2c71505e1a44233545383d46375e567d36c7a7f38f6d" exitCode=0 Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.766153 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zcqks" podStartSLOduration=24.437643544 podStartE2EDuration="26.766132545s" podCreationTimestamp="2026-02-17 17:49:37 
+0000 UTC" firstStartedPulling="2026-02-17 17:50:00.627360027 +0000 UTC m=+372.002763322" lastFinishedPulling="2026-02-17 17:50:02.955849058 +0000 UTC m=+374.331252323" observedRunningTime="2026-02-17 17:50:03.728585686 +0000 UTC m=+375.103988951" watchObservedRunningTime="2026-02-17 17:50:03.766132545 +0000 UTC m=+375.141535810" Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.766964 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerDied","Data":"168b7bdb369be71c31ac2c71505e1a44233545383d46375e567d36c7a7f38f6d"} Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.775746 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.775828 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.808589 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jstz7" podStartSLOduration=21.467655700999998 podStartE2EDuration="23.808574826s" podCreationTimestamp="2026-02-17 17:49:40 +0000 UTC" firstStartedPulling="2026-02-17 17:50:00.628074826 +0000 UTC m=+372.003478101" lastFinishedPulling="2026-02-17 17:50:02.968993961 +0000 UTC m=+374.344397226" observedRunningTime="2026-02-17 17:50:03.80499888 +0000 UTC m=+375.180402145" watchObservedRunningTime="2026-02-17 17:50:03.808574826 +0000 UTC m=+375.183978091" Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.834445 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bv8xp" podStartSLOduration=4.901879741 podStartE2EDuration="31.834420952s" podCreationTimestamp="2026-02-17 17:49:32 +0000 UTC" 
firstStartedPulling="2026-02-17 17:49:35.916466784 +0000 UTC m=+347.291870049" lastFinishedPulling="2026-02-17 17:50:02.849007995 +0000 UTC m=+374.224411260" observedRunningTime="2026-02-17 17:50:03.830786704 +0000 UTC m=+375.206189989" watchObservedRunningTime="2026-02-17 17:50:03.834420952 +0000 UTC m=+375.209824217" Feb 17 17:50:03 crc kubenswrapper[4892]: I0217 17:50:03.867361 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xnsqq" podUID="b8e76551-8839-410b-9aec-a671ca2119dc" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:03 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:03 crc kubenswrapper[4892]: > Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.783056 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerStarted","Data":"5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa"} Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.784953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerStarted","Data":"2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5"} Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.786681 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerStarted","Data":"5186ffd59cb954c27883939ccd1e21943d600879ac8599b458c2142f6d2b66d3"} Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.788331 4892 generic.go:334] "Generic (PLEG): container finished" podID="a7833e24-e8f2-422a-8759-be2e54c1f6ee" containerID="e7a7526845c320ce355b50a58e722e7436fe13d19dfa120fb3ab513a313cfb1d" exitCode=0 Feb 17 17:50:04 crc 
kubenswrapper[4892]: I0217 17:50:04.788387 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwn4v" event={"ID":"a7833e24-e8f2-422a-8759-be2e54c1f6ee","Type":"ContainerDied","Data":"e7a7526845c320ce355b50a58e722e7436fe13d19dfa120fb3ab513a313cfb1d"} Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.790224 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerStarted","Data":"4c7c4f01e276d66570dc871177174486b174db591f011f3bcfaf345727957eca"} Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.792437 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhfx5" event={"ID":"7700d4bf-d2a8-4c27-bba9-712bde89f76c","Type":"ContainerStarted","Data":"e77c967bc170abf9c8231f16f56ab464ea5099c27a28288613c9f6279ff0066d"} Feb 17 17:50:04 crc kubenswrapper[4892]: I0217 17:50:04.899342 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2k8h" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:04 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:04 crc kubenswrapper[4892]: > Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.168263 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.168318 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.211192 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.801396 
4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerStarted","Data":"31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995"} Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.804345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerStarted","Data":"707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2"} Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.806877 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerStarted","Data":"1619ab7d9ed780d2cc40a936c7b7d6cabc77abf480db3f68fced340a2f40a7f1"} Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.809417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwn4v" event={"ID":"a7833e24-e8f2-422a-8759-be2e54c1f6ee","Type":"ContainerStarted","Data":"5ca267414837c2ff1aa02ec862bf3c8c6393d40bd6fb4278bd6640485b61f489"} Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.811491 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerStarted","Data":"e51c1574a60189904fadbce2d623f904d53f6722685a80e95f6a12844fd34fa7"} Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.830484 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qj4k" podStartSLOduration=22.543960193 podStartE2EDuration="26.830457003s" podCreationTimestamp="2026-02-17 17:49:39 +0000 UTC" firstStartedPulling="2026-02-17 17:50:00.628158448 +0000 UTC m=+372.003561753" lastFinishedPulling="2026-02-17 17:50:04.914655298 +0000 
UTC m=+376.290058563" observedRunningTime="2026-02-17 17:50:05.827446152 +0000 UTC m=+377.202849417" watchObservedRunningTime="2026-02-17 17:50:05.830457003 +0000 UTC m=+377.205860268" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.845185 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5c7bz" podStartSLOduration=12.871613797 podStartE2EDuration="30.845168619s" podCreationTimestamp="2026-02-17 17:49:35 +0000 UTC" firstStartedPulling="2026-02-17 17:49:46.731251543 +0000 UTC m=+358.106654798" lastFinishedPulling="2026-02-17 17:50:04.704806355 +0000 UTC m=+376.080209620" observedRunningTime="2026-02-17 17:50:05.844364487 +0000 UTC m=+377.219767762" watchObservedRunningTime="2026-02-17 17:50:05.845168619 +0000 UTC m=+377.220571884" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.864536 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hwn4v" podStartSLOduration=15.581588464 podStartE2EDuration="18.864518289s" podCreationTimestamp="2026-02-17 17:49:47 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.25602001 +0000 UTC m=+373.631423265" lastFinishedPulling="2026-02-17 17:50:05.538949825 +0000 UTC m=+376.914353090" observedRunningTime="2026-02-17 17:50:05.861678763 +0000 UTC m=+377.237082038" watchObservedRunningTime="2026-02-17 17:50:05.864518289 +0000 UTC m=+377.239921554" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.874174 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.874235 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.890721 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-fj4g8" podStartSLOduration=24.612083525 podStartE2EDuration="28.890681283s" podCreationTimestamp="2026-02-17 17:49:37 +0000 UTC" firstStartedPulling="2026-02-17 17:50:00.627038968 +0000 UTC m=+372.002442253" lastFinishedPulling="2026-02-17 17:50:04.905636746 +0000 UTC m=+376.281040011" observedRunningTime="2026-02-17 17:50:05.888435063 +0000 UTC m=+377.263838328" watchObservedRunningTime="2026-02-17 17:50:05.890681283 +0000 UTC m=+377.266084548" Feb 17 17:50:05 crc kubenswrapper[4892]: I0217 17:50:05.914403 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6vvnr" podStartSLOduration=5.675850883 podStartE2EDuration="35.91438616s" podCreationTimestamp="2026-02-17 17:49:30 +0000 UTC" firstStartedPulling="2026-02-17 17:49:34.882645635 +0000 UTC m=+346.258048940" lastFinishedPulling="2026-02-17 17:50:05.121180942 +0000 UTC m=+376.496584217" observedRunningTime="2026-02-17 17:50:05.905660626 +0000 UTC m=+377.281063901" watchObservedRunningTime="2026-02-17 17:50:05.91438616 +0000 UTC m=+377.289789425" Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 17:50:06.216916 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 17:50:06.217927 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 17:50:06.263634 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 17:50:06.825134 4892 generic.go:334] "Generic (PLEG): container finished" podID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerID="17754112c2787e44e163a30f9110738ef9beb1dee8705a1644d1a377921c6503" exitCode=0 Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 
17:50:06.825420 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerDied","Data":"17754112c2787e44e163a30f9110738ef9beb1dee8705a1644d1a377921c6503"} Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 17:50:06.862897 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hghz" Feb 17 17:50:06 crc kubenswrapper[4892]: I0217 17:50:06.943998 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5c7bz" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:06 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:06 crc kubenswrapper[4892]: > Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.371072 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.371128 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.414321 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zcqks" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.424562 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.424641 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.871049 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.871090 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.957961 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.958017 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.983587 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:50:07 crc kubenswrapper[4892]: I0217 17:50:07.984184 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hwn4v" Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.553612 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.553673 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.596134 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.839154 4892 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerStarted","Data":"296e009cfb6f2a21e4228283cfc44e00354c12e851eb69cdae67c43b2b299156"} Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.862868 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jxtbb" podStartSLOduration=21.846686622 podStartE2EDuration="27.86279875s" podCreationTimestamp="2026-02-17 17:49:41 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.22999539 +0000 UTC m=+373.605398655" lastFinishedPulling="2026-02-17 17:50:08.246107528 +0000 UTC m=+379.621510783" observedRunningTime="2026-02-17 17:50:08.86092134 +0000 UTC m=+380.236324605" watchObservedRunningTime="2026-02-17 17:50:08.86279875 +0000 UTC m=+380.238202015" Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.885113 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 17:50:08 crc kubenswrapper[4892]: I0217 17:50:08.909421 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gfc4l" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:08 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:08 crc kubenswrapper[4892]: > Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.010840 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fj4g8" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:09 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:09 crc kubenswrapper[4892]: > Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.024222 4892 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-hwn4v" podUID="a7833e24-e8f2-422a-8759-be2e54c1f6ee" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:09 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:09 crc kubenswrapper[4892]: > Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.602710 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.603376 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.649912 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.847956 4892 generic.go:334] "Generic (PLEG): container finished" podID="7700d4bf-d2a8-4c27-bba9-712bde89f76c" containerID="e77c967bc170abf9c8231f16f56ab464ea5099c27a28288613c9f6279ff0066d" exitCode=0 Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.848181 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhfx5" event={"ID":"7700d4bf-d2a8-4c27-bba9-712bde89f76c","Type":"ContainerDied","Data":"e77c967bc170abf9c8231f16f56ab464ea5099c27a28288613c9f6279ff0066d"} Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.851536 4892 generic.go:334] "Generic (PLEG): container finished" podID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerID="5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa" exitCode=0 Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.851980 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" 
event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerDied","Data":"5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa"} Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.947624 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.947687 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:50:09 crc kubenswrapper[4892]: I0217 17:50:09.993992 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.559533 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.559609 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.598939 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.861215 4892 generic.go:334] "Generic (PLEG): container finished" podID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerID="4c7c4f01e276d66570dc871177174486b174db591f011f3bcfaf345727957eca" exitCode=0 Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.861381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerDied","Data":"4c7c4f01e276d66570dc871177174486b174db591f011f3bcfaf345727957eca"} Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.914135 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.924414 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fsc5r" Feb 17 17:50:10 crc kubenswrapper[4892]: I0217 17:50:10.944570 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jstz7" Feb 17 17:50:11 crc kubenswrapper[4892]: I0217 17:50:11.018627 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:50:11 crc kubenswrapper[4892]: I0217 17:50:11.018696 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 17:50:11 crc kubenswrapper[4892]: I0217 17:50:11.762181 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:50:11 crc kubenswrapper[4892]: I0217 17:50:11.762230 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.054217 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6vvnr" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:12 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:12 crc kubenswrapper[4892]: > Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.364330 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.364940 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 
17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.430418 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.831024 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jxtbb" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="registry-server" probeResult="failure" output=< Feb 17 17:50:12 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:50:12 crc kubenswrapper[4892]: > Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.867994 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.925553 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xnsqq" Feb 17 17:50:12 crc kubenswrapper[4892]: I0217 17:50:12.933287 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 17:50:13 crc kubenswrapper[4892]: I0217 17:50:13.391079 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:50:13 crc kubenswrapper[4892]: I0217 17:50:13.391414 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:50:13 crc kubenswrapper[4892]: I0217 17:50:13.452286 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n84ch" Feb 17 17:50:13 crc kubenswrapper[4892]: I0217 17:50:13.830126 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 17:50:13 crc kubenswrapper[4892]: I0217 17:50:13.885053 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k2k8h"
Feb 17 17:50:13 crc kubenswrapper[4892]: I0217 17:50:13.956629 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n84ch"
Feb 17 17:50:15 crc kubenswrapper[4892]: I0217 17:50:15.243211 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ljjvr"
Feb 17 17:50:15 crc kubenswrapper[4892]: I0217 17:50:15.942510 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5c7bz"
Feb 17 17:50:16 crc kubenswrapper[4892]: I0217 17:50:16.010936 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5c7bz"
Feb 17 17:50:17 crc kubenswrapper[4892]: I0217 17:50:17.440258 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zcqks"
Feb 17 17:50:17 crc kubenswrapper[4892]: I0217 17:50:17.930724 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gfc4l"
Feb 17 17:50:17 crc kubenswrapper[4892]: I0217 17:50:17.985470 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gfc4l"
Feb 17 17:50:18 crc kubenswrapper[4892]: I0217 17:50:18.072948 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hwn4v"
Feb 17 17:50:18 crc kubenswrapper[4892]: I0217 17:50:18.076051 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fj4g8"
Feb 17 17:50:18 crc kubenswrapper[4892]: I0217 17:50:18.135627 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hwn4v"
Feb 17 17:50:18 crc kubenswrapper[4892]: I0217 17:50:18.137320 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fj4g8"
Feb 17 17:50:19 crc kubenswrapper[4892]: I0217 17:50:19.245125 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" podUID="b3fc996d-107b-4647-b52f-54fef31f9059" containerName="registry" containerID="cri-o://f7b748acabc068e57b10cd73410ba1adaa2b620c1854c8e6c1a80454fbd7f693" gracePeriod=30
Feb 17 17:50:21 crc kubenswrapper[4892]: I0217 17:50:21.067577 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6vvnr"
Feb 17 17:50:21 crc kubenswrapper[4892]: I0217 17:50:21.112122 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6vvnr"
Feb 17 17:50:21 crc kubenswrapper[4892]: I0217 17:50:21.842052 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jxtbb"
Feb 17 17:50:21 crc kubenswrapper[4892]: I0217 17:50:21.893640 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jxtbb"
Feb 17 17:50:23 crc kubenswrapper[4892]: I0217 17:50:23.960333 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhfx5" event={"ID":"7700d4bf-d2a8-4c27-bba9-712bde89f76c","Type":"ContainerStarted","Data":"516c6fb31690de852ed046776445903ab7d70726987f2c60f09c77cbed0a38c7"}
Feb 17 17:50:23 crc kubenswrapper[4892]: I0217 17:50:23.963270 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerStarted","Data":"c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672"}
Feb 17 17:50:24 crc kubenswrapper[4892]: I0217 17:50:24.970508 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerStarted","Data":"60ede4528881a0e75c0a773cdc6e4742ed2805584239afe4bcecf7be8adcba0a"}
Feb 17 17:50:25 crc kubenswrapper[4892]: I0217 17:50:25.001004 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rmmjd" podStartSLOduration=19.053988265 podStartE2EDuration="39.000985715s" podCreationTimestamp="2026-02-17 17:49:46 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.563927869 +0000 UTC m=+373.939331134" lastFinishedPulling="2026-02-17 17:50:22.510925319 +0000 UTC m=+393.886328584" observedRunningTime="2026-02-17 17:50:24.989650229 +0000 UTC m=+396.365053494" watchObservedRunningTime="2026-02-17 17:50:25.000985715 +0000 UTC m=+396.376388980"
Feb 17 17:50:25 crc kubenswrapper[4892]: I0217 17:50:25.026364 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dhfx5" podStartSLOduration=21.454294879 podStartE2EDuration="42.02633124s" podCreationTimestamp="2026-02-17 17:49:43 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.397624668 +0000 UTC m=+373.773027943" lastFinishedPulling="2026-02-17 17:50:22.969661019 +0000 UTC m=+394.345064304" observedRunningTime="2026-02-17 17:50:25.01336015 +0000 UTC m=+396.388763415" watchObservedRunningTime="2026-02-17 17:50:25.02633124 +0000 UTC m=+396.401734505"
Feb 17 17:50:25 crc kubenswrapper[4892]: I0217 17:50:25.041679 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5cd89" podStartSLOduration=23.285655164 podStartE2EDuration="43.041657785s" podCreationTimestamp="2026-02-17 17:49:42 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.52155135 +0000 UTC m=+373.896954615" lastFinishedPulling="2026-02-17 17:50:22.277553971 +0000 UTC m=+393.652957236" observedRunningTime="2026-02-17 17:50:25.038239533 +0000 UTC m=+396.413642798" watchObservedRunningTime="2026-02-17 17:50:25.041657785 +0000 UTC m=+396.417061050"
Feb 17 17:50:26 crc kubenswrapper[4892]: I0217 17:50:26.647429 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rmmjd"
Feb 17 17:50:26 crc kubenswrapper[4892]: I0217 17:50:26.648102 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rmmjd"
Feb 17 17:50:26 crc kubenswrapper[4892]: I0217 17:50:26.696606 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rmmjd"
Feb 17 17:50:29 crc kubenswrapper[4892]: I0217 17:50:29.716181 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-697d97f7c8-cnjbv_b3fc996d-107b-4647-b52f-54fef31f9059/registry/0.log"
Feb 17 17:50:29 crc kubenswrapper[4892]: I0217 17:50:29.716752 4892 generic.go:334] "Generic (PLEG): container finished" podID="b3fc996d-107b-4647-b52f-54fef31f9059" containerID="f7b748acabc068e57b10cd73410ba1adaa2b620c1854c8e6c1a80454fbd7f693" exitCode=-1
Feb 17 17:50:29 crc kubenswrapper[4892]: I0217 17:50:29.716863 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" event={"ID":"b3fc996d-107b-4647-b52f-54fef31f9059","Type":"ContainerDied","Data":"f7b748acabc068e57b10cd73410ba1adaa2b620c1854c8e6c1a80454fbd7f693"}
Feb 17 17:50:29 crc kubenswrapper[4892]: I0217 17:50:29.759579 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rmmjd"
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.453847 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-697d97f7c8-cnjbv_b3fc996d-107b-4647-b52f-54fef31f9059/registry/0.log"
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.454385 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv"
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516369 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-trusted-ca\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516445 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3fc996d-107b-4647-b52f-54fef31f9059-installation-pull-secrets\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516510 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-registry-tls\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516652 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516737 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-registry-certificates\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516828 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3fc996d-107b-4647-b52f-54fef31f9059-ca-trust-extracted\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516902 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhlmx\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-kube-api-access-vhlmx\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.516985 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-bound-sa-token\") pod \"b3fc996d-107b-4647-b52f-54fef31f9059\" (UID: \"b3fc996d-107b-4647-b52f-54fef31f9059\") "
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.517681 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.519225 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.527882 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.528086 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-kube-api-access-vhlmx" (OuterVolumeSpecName: "kube-api-access-vhlmx") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "kube-api-access-vhlmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.528989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.531609 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fc996d-107b-4647-b52f-54fef31f9059-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.535720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.542502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3fc996d-107b-4647-b52f-54fef31f9059-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b3fc996d-107b-4647-b52f-54fef31f9059" (UID: "b3fc996d-107b-4647-b52f-54fef31f9059"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.618989 4892 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3fc996d-107b-4647-b52f-54fef31f9059-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.619276 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhlmx\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-kube-api-access-vhlmx\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.619393 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.619499 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.619659 4892 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3fc996d-107b-4647-b52f-54fef31f9059-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.619769 4892 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3fc996d-107b-4647-b52f-54fef31f9059-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:30 crc kubenswrapper[4892]: I0217 17:50:30.619903 4892 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3fc996d-107b-4647-b52f-54fef31f9059-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 17 17:50:33 crc kubenswrapper[4892]: I0217 17:50:33.185096 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5cd89"
Feb 17 17:50:33 crc kubenswrapper[4892]: I0217 17:50:33.185416 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5cd89"
Feb 17 17:50:33 crc kubenswrapper[4892]: I0217 17:50:33.252934 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5cd89"
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.206325 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dhfx5"
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.206620 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dhfx5"
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.243798 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dhfx5"
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.468683 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-697d97f7c8-cnjbv_b3fc996d-107b-4647-b52f-54fef31f9059/registry/0.log"
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.468911 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" event={"ID":"b3fc996d-107b-4647-b52f-54fef31f9059","Type":"ContainerDied","Data":"46de291552e7a1225c0a9f84848364c156ad88e5930d6cc7e79e21c197073f82"}
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.468999 4892 scope.go:117] "RemoveContainer" containerID="f7b748acabc068e57b10cd73410ba1adaa2b620c1854c8e6c1a80454fbd7f693"
Feb 17 17:50:34 crc kubenswrapper[4892]: I0217 17:50:34.469012 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv"
Feb 17 17:50:35 crc kubenswrapper[4892]: I0217 17:50:35.478217 4892 generic.go:334] "Generic (PLEG): container finished" podID="66354c96-340d-43c7-98d0-19713f857884" containerID="009bd6851f021fef23a438ed46f023782908848880371c9dec7ee845540b639a" exitCode=0
Feb 17 17:50:35 crc kubenswrapper[4892]: I0217 17:50:35.478303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerDied","Data":"009bd6851f021fef23a438ed46f023782908848880371c9dec7ee845540b639a"}
Feb 17 17:50:35 crc kubenswrapper[4892]: I0217 17:50:35.526558 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dhfx5"
Feb 17 17:50:35 crc kubenswrapper[4892]: I0217 17:50:35.526906 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5cd89"
Feb 17 17:50:37 crc kubenswrapper[4892]: I0217 17:50:37.424451 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:50:37 crc kubenswrapper[4892]: I0217 17:50:37.425490 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:50:38 crc kubenswrapper[4892]: I0217 17:50:38.501574 4892 generic.go:334] "Generic (PLEG): container finished" podID="272b1940-458c-4610-a618-fcc2a4fab95c" containerID="2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5" exitCode=0
Feb 17 17:50:38 crc kubenswrapper[4892]: I0217 17:50:38.501680 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerDied","Data":"2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5"}
Feb 17 17:50:38 crc kubenswrapper[4892]: I0217 17:50:38.505210 4892 generic.go:334] "Generic (PLEG): container finished" podID="d276a412-68f8-4069-9aa2-275fdb23997d" containerID="5186ffd59cb954c27883939ccd1e21943d600879ac8599b458c2142f6d2b66d3" exitCode=0
Feb 17 17:50:38 crc kubenswrapper[4892]: I0217 17:50:38.505263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerDied","Data":"5186ffd59cb954c27883939ccd1e21943d600879ac8599b458c2142f6d2b66d3"}
Feb 17 17:50:40 crc kubenswrapper[4892]: I0217 17:50:40.523647 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerStarted","Data":"cc4815a035b4957f5c46b6adc05333fe70f215427c1b860521e216f31e27e5e6"}
Feb 17 17:50:41 crc kubenswrapper[4892]: I0217 17:50:41.554111 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ztgz4" podStartSLOduration=7.209804175 podStartE2EDuration="1m7.55408343s" podCreationTimestamp="2026-02-17 17:49:34 +0000 UTC" firstStartedPulling="2026-02-17 17:49:39.011930228 +0000 UTC m=+350.387333493" lastFinishedPulling="2026-02-17 17:50:39.356209483 +0000 UTC m=+410.731612748" observedRunningTime="2026-02-17 17:50:41.552018484 +0000 UTC m=+412.927421829" watchObservedRunningTime="2026-02-17 17:50:41.55408343 +0000 UTC m=+412.929486735"
Feb 17 17:50:44 crc kubenswrapper[4892]: I0217 17:50:44.965268 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ztgz4"
Feb 17 17:50:44 crc kubenswrapper[4892]: I0217 17:50:44.965719 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ztgz4"
Feb 17 17:50:46 crc kubenswrapper[4892]: I0217 17:50:46.035613 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ztgz4" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="registry-server" probeResult="failure" output=<
Feb 17 17:50:46 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Feb 17 17:50:46 crc kubenswrapper[4892]: >
Feb 17 17:50:48 crc kubenswrapper[4892]: I0217 17:50:48.591291 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerStarted","Data":"f057e6826ee9b4451acb2ecf5775b8d00830c9d920d4c51c7a05610476ed7701"}
Feb 17 17:50:48 crc kubenswrapper[4892]: I0217 17:50:48.625122 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbtj9" podStartSLOduration=19.025209314 podStartE2EDuration="1m3.625067586s" podCreationTimestamp="2026-02-17 17:49:45 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.365213846 +0000 UTC m=+373.740617111" lastFinishedPulling="2026-02-17 17:50:46.965072108 +0000 UTC m=+418.340475383" observedRunningTime="2026-02-17 17:50:48.622053915 +0000 UTC m=+419.997457210" watchObservedRunningTime="2026-02-17 17:50:48.625067586 +0000 UTC m=+420.000470871"
Feb 17 17:50:49 crc kubenswrapper[4892]: I0217 17:50:49.516565 4892 scope.go:117] "RemoveContainer" containerID="2199d2bd60d27dc4d3c3ff4a2ef9e4cbad59c3546a36206258b82875901b69d5"
Feb 17 17:50:50 crc kubenswrapper[4892]: I0217 17:50:50.708070 4892 scope.go:117] "RemoveContainer" containerID="2fe34ba04ce082dcb2c2040fafa3d9b0fb0f78fd569d08c9e53278873fb6434c"
Feb 17 17:50:52 crc kubenswrapper[4892]: I0217 17:50:52.182727 4892 scope.go:117] "RemoveContainer" containerID="23963364690d3b6cdb1cb76d08e094b0a7c2fe304f8d20a673206357e0152193"
Feb 17 17:50:52 crc kubenswrapper[4892]: I0217 17:50:52.211478 4892 scope.go:117] "RemoveContainer" containerID="bf7e90e5029b6c000084531e8c465365c236ea0c554bf0ab65b936fc0f53cfe8"
Feb 17 17:50:52 crc kubenswrapper[4892]: I0217 17:50:52.240025 4892 scope.go:117] "RemoveContainer" containerID="4e37f4ca4c3c9dac281443c5cd61f2c5d81b353fa7e589d5080d48da11d627e5"
Feb 17 17:50:52 crc kubenswrapper[4892]: I0217 17:50:52.261938 4892 scope.go:117] "RemoveContainer" containerID="b7a9fc749b194bcc0558af7017260846aeb8df8df5d84f31ae0f1e9f7ac29e63"
Feb 17 17:50:52 crc kubenswrapper[4892]: I0217 17:50:52.629366 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerStarted","Data":"891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb"}
Feb 17 17:50:52 crc kubenswrapper[4892]: I0217 17:50:52.660718 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5v59h" podStartSLOduration=14.822503269 podStartE2EDuration="1m4.660702708s" podCreationTimestamp="2026-02-17 17:49:48 +0000 UTC" firstStartedPulling="2026-02-17 17:50:02.344802127 +0000 UTC m=+373.720205392" lastFinishedPulling="2026-02-17 17:50:52.183001556 +0000 UTC m=+423.558404831" observedRunningTime="2026-02-17 17:50:52.657733428 +0000 UTC m=+424.033136713" watchObservedRunningTime="2026-02-17 17:50:52.660702708 +0000 UTC m=+424.036105973"
Feb 17 17:50:55 crc kubenswrapper[4892]: I0217 17:50:55.011755 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ztgz4"
Feb 17 17:50:55 crc kubenswrapper[4892]: I0217 17:50:55.068453 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ztgz4"
Feb 17 17:50:55 crc kubenswrapper[4892]: I0217 17:50:55.563160 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbtj9"
Feb 17 17:50:55 crc kubenswrapper[4892]: I0217 17:50:55.563275 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbtj9"
Feb 17 17:50:55 crc kubenswrapper[4892]: I0217 17:50:55.649952 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbtj9"
Feb 17 17:50:55 crc kubenswrapper[4892]: I0217 17:50:55.722503 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbtj9"
Feb 17 17:50:58 crc kubenswrapper[4892]: I0217 17:50:58.768028 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5v59h"
Feb 17 17:50:58 crc kubenswrapper[4892]: I0217 17:50:58.768408 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5v59h"
Feb 17 17:50:59 crc kubenswrapper[4892]: I0217 17:50:59.817260 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5v59h" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="registry-server" probeResult="failure" output=<
Feb 17 17:50:59 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Feb 17 17:50:59 crc kubenswrapper[4892]: >
Feb 17 17:51:01 crc kubenswrapper[4892]: I0217 17:51:01.405962 4892 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podb3fc996d-107b-4647-b52f-54fef31f9059"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb3fc996d-107b-4647-b52f-54fef31f9059] : Timed out while waiting for systemd to remove kubepods-burstable-podb3fc996d_107b_4647_b52f_54fef31f9059.slice"
Feb 17 17:51:03 crc kubenswrapper[4892]: I0217 17:51:03.390032 4892 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podb3fc996d-107b-4647-b52f-54fef31f9059"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb3fc996d-107b-4647-b52f-54fef31f9059] : Timed out while waiting for systemd to remove kubepods-burstable-podb3fc996d_107b_4647_b52f_54fef31f9059.slice"
Feb 17 17:51:04 crc kubenswrapper[4892]: I0217 17:51:04.510167 4892 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podb3fc996d-107b-4647-b52f-54fef31f9059"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb3fc996d-107b-4647-b52f-54fef31f9059] : Timed out while waiting for systemd to remove kubepods-burstable-podb3fc996d_107b_4647_b52f_54fef31f9059.slice"
Feb 17 17:51:04 crc kubenswrapper[4892]: E0217 17:51:04.510217 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable podb3fc996d-107b-4647-b52f-54fef31f9059] : unable to destroy cgroup paths for cgroup [kubepods burstable podb3fc996d-107b-4647-b52f-54fef31f9059] : Timed out while waiting for systemd to remove kubepods-burstable-podb3fc996d_107b_4647_b52f_54fef31f9059.slice" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv" podUID="b3fc996d-107b-4647-b52f-54fef31f9059"
Feb 17 17:51:04 crc kubenswrapper[4892]: I0217 17:51:04.731428 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cnjbv"
Feb 17 17:51:04 crc kubenswrapper[4892]: I0217 17:51:04.788380 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cnjbv"]
Feb 17 17:51:04 crc kubenswrapper[4892]: I0217 17:51:04.798501 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cnjbv"]
Feb 17 17:51:05 crc kubenswrapper[4892]: I0217 17:51:05.378530 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3fc996d-107b-4647-b52f-54fef31f9059" path="/var/lib/kubelet/pods/b3fc996d-107b-4647-b52f-54fef31f9059/volumes"
Feb 17 17:51:05 crc kubenswrapper[4892]: I0217 17:51:05.398326 4892 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podb3fc996d-107b-4647-b52f-54fef31f9059"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb3fc996d-107b-4647-b52f-54fef31f9059] : Timed out while waiting for systemd to remove kubepods-burstable-podb3fc996d_107b_4647_b52f_54fef31f9059.slice"
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.425035 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.425133 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.425216 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt"
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.426141 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f481e119dd06565c296707fbdfe27f9e2c04abd2d62b9e12028ab806fca7152"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.426262 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://1f481e119dd06565c296707fbdfe27f9e2c04abd2d62b9e12028ab806fca7152" gracePeriod=600
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.754415 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="1f481e119dd06565c296707fbdfe27f9e2c04abd2d62b9e12028ab806fca7152" exitCode=0
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.754493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"1f481e119dd06565c296707fbdfe27f9e2c04abd2d62b9e12028ab806fca7152"}
Feb 17 17:51:07 crc kubenswrapper[4892]: I0217 17:51:07.754549 4892 scope.go:117] "RemoveContainer" containerID="1c31ce4ad814ab3be27cbd8d8eb0cb70c7e11352cacd14b705d5c3ed0b56fe52"
Feb 17 17:51:08 crc kubenswrapper[4892]: I0217 17:51:08.767383 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"4bf1ebbe703d10ba0bd5ce4189499a9391545c941bb9c55b294cc5679c9bd301"}
Feb 17 17:51:08 crc kubenswrapper[4892]: I0217 17:51:08.832161 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5v59h"
Feb 17 17:51:08 crc kubenswrapper[4892]: I0217 17:51:08.890331 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5v59h"
Feb 17 17:53:07 crc kubenswrapper[4892]: I0217 17:53:07.424719 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:53:07 crc kubenswrapper[4892]: I0217 17:53:07.425248 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:53:37 crc kubenswrapper[4892]: I0217 17:53:37.425316 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:53:37 crc kubenswrapper[4892]: I0217 17:53:37.426153 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:54:07 crc kubenswrapper[4892]: I0217 17:54:07.424979 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:54:07 crc kubenswrapper[4892]: I0217 17:54:07.425742 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:54:07 crc kubenswrapper[4892]: I0217 17:54:07.425845 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt"
Feb 17 17:54:07 crc kubenswrapper[4892]: I0217 17:54:07.426860 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bf1ebbe703d10ba0bd5ce4189499a9391545c941bb9c55b294cc5679c9bd301"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 17:54:07 crc kubenswrapper[4892]: I0217 17:54:07.426967 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://4bf1ebbe703d10ba0bd5ce4189499a9391545c941bb9c55b294cc5679c9bd301" gracePeriod=600
Feb 17 17:54:08 crc kubenswrapper[4892]: I0217 17:54:08.265049 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="4bf1ebbe703d10ba0bd5ce4189499a9391545c941bb9c55b294cc5679c9bd301" exitCode=0
Feb 17 17:54:08 crc kubenswrapper[4892]: I0217 17:54:08.265110 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"4bf1ebbe703d10ba0bd5ce4189499a9391545c941bb9c55b294cc5679c9bd301"}
Feb 17 17:54:08 crc kubenswrapper[4892]: I0217 17:54:08.265721 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"921d973508b9907ab2d7e5529ed27c9f162bd1cd21401233f975dc91366a6d72"}
Feb 17 17:54:08 crc kubenswrapper[4892]: I0217 17:54:08.265744 4892 scope.go:117] "RemoveContainer" containerID="1f481e119dd06565c296707fbdfe27f9e2c04abd2d62b9e12028ab806fca7152"
Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.715220 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dckf8"]
Feb 17 17:55:23 crc kubenswrapper[4892]: E0217 17:55:23.716229 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3fc996d-107b-4647-b52f-54fef31f9059" containerName="registry"
Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.716247 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3fc996d-107b-4647-b52f-54fef31f9059" containerName="registry"
Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.716457 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3fc996d-107b-4647-b52f-54fef31f9059" containerName="registry"
Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.717047 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.723544 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.724596 4892 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6s48s" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.725031 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.725258 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.728174 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dckf8"] Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.845340 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kff6n\" (UniqueName: \"kubernetes.io/projected/e475f89c-c290-4a9b-8475-2d6155ec7ea2-kube-api-access-kff6n\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.845726 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e475f89c-c290-4a9b-8475-2d6155ec7ea2-crc-storage\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.845871 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e475f89c-c290-4a9b-8475-2d6155ec7ea2-node-mnt\") pod \"crc-storage-crc-dckf8\" (UID: 
\"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.947134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e475f89c-c290-4a9b-8475-2d6155ec7ea2-node-mnt\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.947237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kff6n\" (UniqueName: \"kubernetes.io/projected/e475f89c-c290-4a9b-8475-2d6155ec7ea2-kube-api-access-kff6n\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.947282 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e475f89c-c290-4a9b-8475-2d6155ec7ea2-crc-storage\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.947523 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e475f89c-c290-4a9b-8475-2d6155ec7ea2-node-mnt\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.948264 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e475f89c-c290-4a9b-8475-2d6155ec7ea2-crc-storage\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:23 crc kubenswrapper[4892]: I0217 17:55:23.980905 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kff6n\" (UniqueName: \"kubernetes.io/projected/e475f89c-c290-4a9b-8475-2d6155ec7ea2-kube-api-access-kff6n\") pod \"crc-storage-crc-dckf8\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:24 crc kubenswrapper[4892]: I0217 17:55:24.048215 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:24 crc kubenswrapper[4892]: I0217 17:55:24.541965 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dckf8"] Feb 17 17:55:24 crc kubenswrapper[4892]: I0217 17:55:24.556397 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:55:24 crc kubenswrapper[4892]: I0217 17:55:24.940853 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dckf8" event={"ID":"e475f89c-c290-4a9b-8475-2d6155ec7ea2","Type":"ContainerStarted","Data":"14b7d7453f8efb12c61b58bde3bba0aea6bd3f1bf5b757dd0ef46ca14b096229"} Feb 17 17:55:26 crc kubenswrapper[4892]: I0217 17:55:26.960162 4892 generic.go:334] "Generic (PLEG): container finished" podID="e475f89c-c290-4a9b-8475-2d6155ec7ea2" containerID="7b2c8e54ef6f6676cc9ceaabe3aac48284268991de638a01a0cf0bb47c8f122b" exitCode=0 Feb 17 17:55:26 crc kubenswrapper[4892]: I0217 17:55:26.960241 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dckf8" event={"ID":"e475f89c-c290-4a9b-8475-2d6155ec7ea2","Type":"ContainerDied","Data":"7b2c8e54ef6f6676cc9ceaabe3aac48284268991de638a01a0cf0bb47c8f122b"} Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.308578 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.423467 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e475f89c-c290-4a9b-8475-2d6155ec7ea2-crc-storage\") pod \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.423602 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kff6n\" (UniqueName: \"kubernetes.io/projected/e475f89c-c290-4a9b-8475-2d6155ec7ea2-kube-api-access-kff6n\") pod \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.423675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e475f89c-c290-4a9b-8475-2d6155ec7ea2-node-mnt\") pod \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\" (UID: \"e475f89c-c290-4a9b-8475-2d6155ec7ea2\") " Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.423892 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e475f89c-c290-4a9b-8475-2d6155ec7ea2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e475f89c-c290-4a9b-8475-2d6155ec7ea2" (UID: "e475f89c-c290-4a9b-8475-2d6155ec7ea2"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.424222 4892 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e475f89c-c290-4a9b-8475-2d6155ec7ea2-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.431268 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e475f89c-c290-4a9b-8475-2d6155ec7ea2-kube-api-access-kff6n" (OuterVolumeSpecName: "kube-api-access-kff6n") pod "e475f89c-c290-4a9b-8475-2d6155ec7ea2" (UID: "e475f89c-c290-4a9b-8475-2d6155ec7ea2"). InnerVolumeSpecName "kube-api-access-kff6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.455091 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e475f89c-c290-4a9b-8475-2d6155ec7ea2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e475f89c-c290-4a9b-8475-2d6155ec7ea2" (UID: "e475f89c-c290-4a9b-8475-2d6155ec7ea2"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.526778 4892 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e475f89c-c290-4a9b-8475-2d6155ec7ea2-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.526865 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kff6n\" (UniqueName: \"kubernetes.io/projected/e475f89c-c290-4a9b-8475-2d6155ec7ea2-kube-api-access-kff6n\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.980552 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dckf8" event={"ID":"e475f89c-c290-4a9b-8475-2d6155ec7ea2","Type":"ContainerDied","Data":"14b7d7453f8efb12c61b58bde3bba0aea6bd3f1bf5b757dd0ef46ca14b096229"} Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.980704 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b7d7453f8efb12c61b58bde3bba0aea6bd3f1bf5b757dd0ef46ca14b096229" Feb 17 17:55:28 crc kubenswrapper[4892]: I0217 17:55:28.980717 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dckf8" Feb 17 17:55:29 crc kubenswrapper[4892]: E0217 17:55:29.124594 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode475f89c_c290_4a9b_8475_2d6155ec7ea2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode475f89c_c290_4a9b_8475_2d6155ec7ea2.slice/crio-14b7d7453f8efb12c61b58bde3bba0aea6bd3f1bf5b757dd0ef46ca14b096229\": RecentStats: unable to find data in memory cache]" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.395964 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kp5h9"] Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398288 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-controller" containerID="cri-o://3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398358 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="northd" containerID="cri-o://f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398466 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398535 4892 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-node" containerID="cri-o://d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398605 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-acl-logging" containerID="cri-o://5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398606 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="sbdb" containerID="cri-o://03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.398652 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="nbdb" containerID="cri-o://0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.452329 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" containerID="cri-o://71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" gracePeriod=30 Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.747796 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/3.log" Feb 17 
17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.751204 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovn-acl-logging/0.log" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.752167 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovn-controller/0.log" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.752916 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796599 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-etc-openvswitch\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796647 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-bin\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796665 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-netns\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796688 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-netd\") pod 
\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796728 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-config\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796747 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-ovn-kubernetes\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796778 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcqf4\" (UniqueName: \"kubernetes.io/projected/b23058a0-04ec-4a23-82cb-60f9b368eaa0-kube-api-access-vcqf4\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796827 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-systemd-units\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796845 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-script-lib\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796869 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-slash\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796886 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796905 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-ovn\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796930 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-log-socket\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796968 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovn-node-metrics-cert\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.796990 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-node-log\") pod 
\"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.797007 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-var-lib-openvswitch\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.797027 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-env-overrides\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.797052 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-kubelet\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.797069 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-openvswitch\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.797086 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-systemd\") pod \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\" (UID: \"b23058a0-04ec-4a23-82cb-60f9b368eaa0\") " Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798685 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-slash" (OuterVolumeSpecName: "host-slash") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798723 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798735 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-node-log" (OuterVolumeSpecName: "node-log") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798804 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798870 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.798989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799050 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799096 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799142 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799185 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799209 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-log-socket" (OuterVolumeSpecName: "log-socket") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799598 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799801 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.799873 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.800302 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.800441 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.805523 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b23058a0-04ec-4a23-82cb-60f9b368eaa0-kube-api-access-vcqf4" (OuterVolumeSpecName: "kube-api-access-vcqf4") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "kube-api-access-vcqf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.805877 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.830380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-db2ft"] Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.831021 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.831160 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.831267 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kubecfg-setup" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.831367 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kubecfg-setup" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.831494 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.831570 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.831637 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.831723 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.831836 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="nbdb" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.831907 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="nbdb" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.831970 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.832029 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.832102 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="sbdb" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.832169 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="sbdb" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.832246 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.832311 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.832377 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.832439 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.832501 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-node" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.832565 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-node" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.832644 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="northd" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.832798 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="northd" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.833051 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-acl-logging" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.833163 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-acl-logging" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.833289 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e475f89c-c290-4a9b-8475-2d6155ec7ea2" containerName="storage" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.833402 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e475f89c-c290-4a9b-8475-2d6155ec7ea2" containerName="storage" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.833741 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e475f89c-c290-4a9b-8475-2d6155ec7ea2" containerName="storage" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.833912 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-acl-logging" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834036 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834153 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834275 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="northd" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834400 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="nbdb" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834526 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834772 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834920 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.835025 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovn-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.835148 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.835256 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="kube-rbac-proxy-node" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.835359 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="sbdb" Feb 17 17:55:32 crc kubenswrapper[4892]: E0217 17:55:32.835690 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.834145 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b23058a0-04ec-4a23-82cb-60f9b368eaa0" (UID: "b23058a0-04ec-4a23-82cb-60f9b368eaa0"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.836527 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerName="ovnkube-controller" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.839607 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899491 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899549 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-cni-bin\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899618 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-etc-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899639 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-log-socket\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899675 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-run-netns\") pod \"ovnkube-node-db2ft\" 
(UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899696 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-env-overrides\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899717 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899740 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-cni-netd\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899762 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-slash\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899878 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.899937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovn-node-metrics-cert\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900213 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-node-log\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovnkube-config\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900281 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-systemd\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900383 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-var-lib-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900409 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-systemd-units\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900431 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-kubelet\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900540 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmr6\" (UniqueName: \"kubernetes.io/projected/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-kube-api-access-4zmr6\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900598 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovnkube-script-lib\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.900708 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-ovn\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902146 4892 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902172 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902186 4892 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902204 4892 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902219 4892 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902236 4892 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902249 4892 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902268 4892 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902281 4892 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902294 4892 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902307 4892 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902322 4892 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902335 4892 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902348 4892 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902363 4892 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902376 4892 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902388 4892 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902400 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b23058a0-04ec-4a23-82cb-60f9b368eaa0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902418 4892 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b23058a0-04ec-4a23-82cb-60f9b368eaa0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:32 crc kubenswrapper[4892]: I0217 17:55:32.902432 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcqf4\" (UniqueName: \"kubernetes.io/projected/b23058a0-04ec-4a23-82cb-60f9b368eaa0-kube-api-access-vcqf4\") on node \"crc\" DevicePath \"\"" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.003409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.003493 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovn-node-metrics-cert\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.003537 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-node-log\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.003551 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.003577 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovnkube-config\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004037 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-node-log\") pod 
\"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004128 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-systemd\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004166 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-var-lib-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004204 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-kubelet\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004216 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-systemd\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-systemd-units\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004265 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmr6\" (UniqueName: \"kubernetes.io/projected/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-kube-api-access-4zmr6\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004295 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-systemd-units\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004299 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovnkube-script-lib\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-kubelet\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004265 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-var-lib-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc 
kubenswrapper[4892]: I0217 17:55:33.004420 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-ovn\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004484 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-run-ovn\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004596 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004656 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-cni-bin\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004701 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004741 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-etc-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004742 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-cni-bin\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004765 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-log-socket\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004797 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-run-netns\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004830 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-env-overrides\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004871 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-cni-netd\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004914 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-log-socket\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004954 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-etc-openvswitch\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.004965 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.005019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovnkube-config\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.005444 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-env-overrides\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.005525 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-run-netns\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.005575 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-cni-netd\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.005695 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-slash\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.005910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-host-slash\") pod \"ovnkube-node-db2ft\" (UID: 
\"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.006379 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovnkube-script-lib\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.008177 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-ovn-node-metrics-cert\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.020076 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/2.log" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.021211 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/1.log" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.021338 4892 generic.go:334] "Generic (PLEG): container finished" podID="43b12f44-0079-4031-9b1d-492c374250df" containerID="304725a2b080e31a96b7983f19b1eedea834a9b2e00608dc922e78f46533ae2a" exitCode=2 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.021435 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerDied","Data":"304725a2b080e31a96b7983f19b1eedea834a9b2e00608dc922e78f46533ae2a"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.021483 4892 scope.go:117] "RemoveContainer" 
containerID="6fc69d29b82db70745c48c3f7761a99709ee267fb45509bfc4ff406273c6f23c" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.022095 4892 scope.go:117] "RemoveContainer" containerID="304725a2b080e31a96b7983f19b1eedea834a9b2e00608dc922e78f46533ae2a" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.022685 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lxpxh_openshift-multus(43b12f44-0079-4031-9b1d-492c374250df)\"" pod="openshift-multus/multus-lxpxh" podUID="43b12f44-0079-4031-9b1d-492c374250df" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.029912 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovnkube-controller/3.log" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.034866 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovn-acl-logging/0.log" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.035706 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kp5h9_b23058a0-04ec-4a23-82cb-60f9b368eaa0/ovn-controller/0.log" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036470 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" exitCode=0 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036567 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036657 
4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036663 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" exitCode=0 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036960 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" exitCode=0 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036985 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" exitCode=0 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037005 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" exitCode=0 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037022 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" exitCode=0 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037036 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" exitCode=143 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.036755 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} Feb 17 
17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037079 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037103 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037125 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037050 4892 generic.go:334] "Generic (PLEG): container finished" podID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" exitCode=143 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037145 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037200 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037219 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037227 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037233 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037240 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037246 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037253 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037294 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037301 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037308 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037326 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037492 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037560 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037572 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037579 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037585 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037832 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037843 4892 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037850 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037856 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037863 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037874 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037887 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037896 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037903 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} Feb 17 
17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037910 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037917 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037923 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037930 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037936 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037942 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037949 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037959 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kp5h9" 
event={"ID":"b23058a0-04ec-4a23-82cb-60f9b368eaa0","Type":"ContainerDied","Data":"17cafc21d14ff978199bd24633c480073cfd1a24a1f65dc24eaeddf962fb96d1"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037968 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037976 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037983 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037990 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.037997 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.038004 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.038010 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.038017 4892 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.038024 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.038030 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.040268 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmr6\" (UniqueName: \"kubernetes.io/projected/64a238eb-25e3-46d8-9a9d-3fadd5198c1c-kube-api-access-4zmr6\") pod \"ovnkube-node-db2ft\" (UID: \"64a238eb-25e3-46d8-9a9d-3fadd5198c1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.067746 4892 scope.go:117] "RemoveContainer" containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.101513 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kp5h9"] Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.105832 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kp5h9"] Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.105853 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.124514 4892 scope.go:117] "RemoveContainer" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.149012 4892 scope.go:117] 
"RemoveContainer" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.159440 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.164016 4892 scope.go:117] "RemoveContainer" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.183589 4892 scope.go:117] "RemoveContainer" containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" Feb 17 17:55:33 crc kubenswrapper[4892]: W0217 17:55:33.193029 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a238eb_25e3_46d8_9a9d_3fadd5198c1c.slice/crio-5dc135e1e93d52aedd53b30e2db0742764b999338931c5eb92fd1f264670c176 WatchSource:0}: Error finding container 5dc135e1e93d52aedd53b30e2db0742764b999338931c5eb92fd1f264670c176: Status 404 returned error can't find the container with id 5dc135e1e93d52aedd53b30e2db0742764b999338931c5eb92fd1f264670c176 Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.201020 4892 scope.go:117] "RemoveContainer" containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.225387 4892 scope.go:117] "RemoveContainer" containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.244825 4892 scope.go:117] "RemoveContainer" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.266675 4892 scope.go:117] "RemoveContainer" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.283049 4892 scope.go:117] "RemoveContainer" 
containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.283446 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": container with ID starting with 71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14 not found: ID does not exist" containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.283483 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} err="failed to get container status \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": rpc error: code = NotFound desc = could not find container \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": container with ID starting with 71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.283510 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.283775 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": container with ID starting with acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1 not found: ID does not exist" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.283809 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} err="failed to get container status \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": rpc error: code = NotFound desc = could not find container \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": container with ID starting with acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.283851 4892 scope.go:117] "RemoveContainer" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.284270 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": container with ID starting with 03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089 not found: ID does not exist" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.284300 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} err="failed to get container status \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": rpc error: code = NotFound desc = could not find container \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": container with ID starting with 03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.284318 4892 scope.go:117] "RemoveContainer" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.284653 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": container with ID starting with 0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb not found: ID does not exist" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.284721 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} err="failed to get container status \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": rpc error: code = NotFound desc = could not find container \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": container with ID starting with 0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.284759 4892 scope.go:117] "RemoveContainer" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.285163 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": container with ID starting with f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222 not found: ID does not exist" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.285219 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} err="failed to get container status \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": rpc error: code = NotFound desc = could not find container 
\"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": container with ID starting with f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.285250 4892 scope.go:117] "RemoveContainer" containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.285587 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": container with ID starting with 2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63 not found: ID does not exist" containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.285633 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} err="failed to get container status \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": rpc error: code = NotFound desc = could not find container \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": container with ID starting with 2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.285660 4892 scope.go:117] "RemoveContainer" containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.286084 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": container with ID starting with d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2 not found: ID does not exist" 
containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.286113 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} err="failed to get container status \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": rpc error: code = NotFound desc = could not find container \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": container with ID starting with d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.286131 4892 scope.go:117] "RemoveContainer" containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.286401 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": container with ID starting with 5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639 not found: ID does not exist" containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.286439 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} err="failed to get container status \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": rpc error: code = NotFound desc = could not find container \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": container with ID starting with 5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.286466 4892 scope.go:117] 
"RemoveContainer" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.286886 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": container with ID starting with 3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146 not found: ID does not exist" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.286914 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} err="failed to get container status \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": rpc error: code = NotFound desc = could not find container \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": container with ID starting with 3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.286935 4892 scope.go:117] "RemoveContainer" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" Feb 17 17:55:33 crc kubenswrapper[4892]: E0217 17:55:33.287204 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": container with ID starting with c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e not found: ID does not exist" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.287248 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} err="failed to get container status \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": rpc error: code = NotFound desc = could not find container \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": container with ID starting with c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.287275 4892 scope.go:117] "RemoveContainer" containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.287644 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} err="failed to get container status \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": rpc error: code = NotFound desc = could not find container \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": container with ID starting with 71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.287682 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.288191 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} err="failed to get container status \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": rpc error: code = NotFound desc = could not find container \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": container with ID starting with acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1 not found: ID does not 
exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.288228 4892 scope.go:117] "RemoveContainer" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.288545 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} err="failed to get container status \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": rpc error: code = NotFound desc = could not find container \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": container with ID starting with 03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.288582 4892 scope.go:117] "RemoveContainer" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.290053 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} err="failed to get container status \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": rpc error: code = NotFound desc = could not find container \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": container with ID starting with 0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.290099 4892 scope.go:117] "RemoveContainer" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.290454 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} err="failed to get container status 
\"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": rpc error: code = NotFound desc = could not find container \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": container with ID starting with f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.290499 4892 scope.go:117] "RemoveContainer" containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.290808 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} err="failed to get container status \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": rpc error: code = NotFound desc = could not find container \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": container with ID starting with 2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.290873 4892 scope.go:117] "RemoveContainer" containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291109 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} err="failed to get container status \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": rpc error: code = NotFound desc = could not find container \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": container with ID starting with d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291133 4892 scope.go:117] "RemoveContainer" 
containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291375 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} err="failed to get container status \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": rpc error: code = NotFound desc = could not find container \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": container with ID starting with 5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291414 4892 scope.go:117] "RemoveContainer" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291622 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} err="failed to get container status \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": rpc error: code = NotFound desc = could not find container \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": container with ID starting with 3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291645 4892 scope.go:117] "RemoveContainer" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291902 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} err="failed to get container status \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": rpc error: code = NotFound desc = could 
not find container \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": container with ID starting with c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.291937 4892 scope.go:117] "RemoveContainer" containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.292170 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} err="failed to get container status \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": rpc error: code = NotFound desc = could not find container \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": container with ID starting with 71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.292194 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.292501 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} err="failed to get container status \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": rpc error: code = NotFound desc = could not find container \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": container with ID starting with acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.292548 4892 scope.go:117] "RemoveContainer" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 
17:55:33.292789 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} err="failed to get container status \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": rpc error: code = NotFound desc = could not find container \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": container with ID starting with 03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.292832 4892 scope.go:117] "RemoveContainer" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293071 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} err="failed to get container status \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": rpc error: code = NotFound desc = could not find container \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": container with ID starting with 0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293092 4892 scope.go:117] "RemoveContainer" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293306 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} err="failed to get container status \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": rpc error: code = NotFound desc = could not find container \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": container with ID starting with 
f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293326 4892 scope.go:117] "RemoveContainer" containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293539 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} err="failed to get container status \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": rpc error: code = NotFound desc = could not find container \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": container with ID starting with 2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293563 4892 scope.go:117] "RemoveContainer" containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293920 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} err="failed to get container status \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": rpc error: code = NotFound desc = could not find container \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": container with ID starting with d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.293958 4892 scope.go:117] "RemoveContainer" containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.294556 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} err="failed to get container status \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": rpc error: code = NotFound desc = could not find container \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": container with ID starting with 5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.294580 4892 scope.go:117] "RemoveContainer" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.294841 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} err="failed to get container status \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": rpc error: code = NotFound desc = could not find container \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": container with ID starting with 3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.294872 4892 scope.go:117] "RemoveContainer" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295090 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} err="failed to get container status \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": rpc error: code = NotFound desc = could not find container \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": container with ID starting with c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e not found: ID does not 
exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295119 4892 scope.go:117] "RemoveContainer" containerID="71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295370 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14"} err="failed to get container status \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": rpc error: code = NotFound desc = could not find container \"71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14\": container with ID starting with 71a7a05edd0e6c9f2879bb1a8d2c1ea260b971a5feeb8d6efc576ee04ce99c14 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295391 4892 scope.go:117] "RemoveContainer" containerID="acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295614 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1"} err="failed to get container status \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": rpc error: code = NotFound desc = could not find container \"acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1\": container with ID starting with acb53587bfb9bbb4e171e5b2c92a91c196c8c0445a80c61d8004ce0fb3c858e1 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295654 4892 scope.go:117] "RemoveContainer" containerID="03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295891 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089"} err="failed to get container status 
\"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": rpc error: code = NotFound desc = could not find container \"03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089\": container with ID starting with 03d19b53802aadf6044f9951674486731811272edd0acb0ae76543e64e327089 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.295915 4892 scope.go:117] "RemoveContainer" containerID="0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296153 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb"} err="failed to get container status \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": rpc error: code = NotFound desc = could not find container \"0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb\": container with ID starting with 0b14261d60e4ba33222d25c05bfbd665407f3683418067c5803213af6001a3cb not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296188 4892 scope.go:117] "RemoveContainer" containerID="f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296431 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222"} err="failed to get container status \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": rpc error: code = NotFound desc = could not find container \"f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222\": container with ID starting with f60fc2e6d78ab582b73b72a8248a00562de7bc5786627f8fcdb9e63403213222 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296457 4892 scope.go:117] "RemoveContainer" 
containerID="2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296711 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63"} err="failed to get container status \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": rpc error: code = NotFound desc = could not find container \"2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63\": container with ID starting with 2ed9c88ae0a3d19ff20304b1dcc5e44322ab5462038492ab11a941d697289b63 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296744 4892 scope.go:117] "RemoveContainer" containerID="d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296948 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2"} err="failed to get container status \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": rpc error: code = NotFound desc = could not find container \"d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2\": container with ID starting with d6c38ef65989169f7edd7516ae154bee71bacec780ea131c680d61665fb32ee2 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.296972 4892 scope.go:117] "RemoveContainer" containerID="5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.297192 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639"} err="failed to get container status \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": rpc error: code = NotFound desc = could 
not find container \"5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639\": container with ID starting with 5b7d4bb2430aabe8c70516d13a269b6070e80ac1f60793b02e9528079c666639 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.297222 4892 scope.go:117] "RemoveContainer" containerID="3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.297433 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146"} err="failed to get container status \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": rpc error: code = NotFound desc = could not find container \"3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146\": container with ID starting with 3355acaba9a057e1cc9473b7c13a8f895d8136d1075fa17f90dcc2abdb9ff146 not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.297457 4892 scope.go:117] "RemoveContainer" containerID="c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.297701 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e"} err="failed to get container status \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": rpc error: code = NotFound desc = could not find container \"c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e\": container with ID starting with c6fa2591e1f8e8613ed1d5c3ec0ea7bf6d01c07dbf366b7880b4fe0a61e6d75e not found: ID does not exist" Feb 17 17:55:33 crc kubenswrapper[4892]: I0217 17:55:33.369380 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b23058a0-04ec-4a23-82cb-60f9b368eaa0" 
path="/var/lib/kubelet/pods/b23058a0-04ec-4a23-82cb-60f9b368eaa0/volumes" Feb 17 17:55:34 crc kubenswrapper[4892]: I0217 17:55:34.048050 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/2.log" Feb 17 17:55:34 crc kubenswrapper[4892]: I0217 17:55:34.051456 4892 generic.go:334] "Generic (PLEG): container finished" podID="64a238eb-25e3-46d8-9a9d-3fadd5198c1c" containerID="f5e407cc652362302bbd02c236b493eedc6965cea8b2b5ed5275a33832bc4e3d" exitCode=0 Feb 17 17:55:34 crc kubenswrapper[4892]: I0217 17:55:34.051529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerDied","Data":"f5e407cc652362302bbd02c236b493eedc6965cea8b2b5ed5275a33832bc4e3d"} Feb 17 17:55:34 crc kubenswrapper[4892]: I0217 17:55:34.051587 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"5dc135e1e93d52aedd53b30e2db0742764b999338931c5eb92fd1f264670c176"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.061123 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"749822abbd2cf5d7a39f623d6a6726ae4052212a846b2e65f39390fbf3ce5da5"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.061648 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"5b55fb549e93188918b10e19ff2b40cff0dfe170626be76ea1e7a7d2206c69f3"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.061665 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" 
event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"4717c14ac0b7f5d422e91fe5a06d8e2e0ab7cb5a85cd04dcb8b02655507038bb"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.061676 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"42b005c2f5169faa1d528f80a2634fa186544b59e629e7f6ecdf463d39a802f8"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.061686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"89fa4cba8f1fd06d88915325dd7d19ab9fc17700793725e5f3db3d4ed150773f"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.061700 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"1f8c00660cddd876fe25684e2fa40b2a89962cee72c242347f7675fa1e009641"} Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.726984 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx"] Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.728566 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.731397 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.844261 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.844345 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.844397 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkg2t\" (UniqueName: \"kubernetes.io/projected/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-kube-api-access-bkg2t\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.945495 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-util\") pod 
\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.945560 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.945594 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkg2t\" (UniqueName: \"kubernetes.io/projected/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-kube-api-access-bkg2t\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.946333 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.946571 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:35 crc kubenswrapper[4892]: I0217 17:55:35.967247 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkg2t\" (UniqueName: \"kubernetes.io/projected/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-kube-api-access-bkg2t\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:36 crc kubenswrapper[4892]: I0217 17:55:36.043371 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:36 crc kubenswrapper[4892]: E0217 17:55:36.095660 4892 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9ab7a2abbe0d9ad06dccc67e73ec9c5d6592648440864c3d8918e62f0a189b2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 17:55:36 crc kubenswrapper[4892]: E0217 17:55:36.095750 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9ab7a2abbe0d9ad06dccc67e73ec9c5d6592648440864c3d8918e62f0a189b2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:36 crc kubenswrapper[4892]: E0217 17:55:36.095788 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9ab7a2abbe0d9ad06dccc67e73ec9c5d6592648440864c3d8918e62f0a189b2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:36 crc kubenswrapper[4892]: E0217 17:55:36.095898 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace(bfe219b7-e412-4fb9-8b96-36ed80ea8cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace(bfe219b7-e412-4fb9-8b96-36ed80ea8cd3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9ab7a2abbe0d9ad06dccc67e73ec9c5d6592648440864c3d8918e62f0a189b2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" Feb 17 17:55:38 crc kubenswrapper[4892]: I0217 17:55:38.097529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"dc74c26a7d5efaafd3613c5f73b10c61f1c8b8cb38356ed52aaa3631577ddfe9"} Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.129596 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" event={"ID":"64a238eb-25e3-46d8-9a9d-3fadd5198c1c","Type":"ContainerStarted","Data":"ca783e52bedb724bfc16ce0ae048fe010da51ed758b479607a999bb1182f1065"} Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.130509 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.130626 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.130742 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.160747 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.162942 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.167779 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" podStartSLOduration=8.167762048 podStartE2EDuration="8.167762048s" 
podCreationTimestamp="2026-02-17 17:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:55:40.162373292 +0000 UTC m=+711.537776577" watchObservedRunningTime="2026-02-17 17:55:40.167762048 +0000 UTC m=+711.543165313" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.415607 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx"] Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.415752 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:40 crc kubenswrapper[4892]: I0217 17:55:40.416337 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:40 crc kubenswrapper[4892]: E0217 17:55:40.447663 4892 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(85130c692c98dcf53e7226e305d47dc81ce96e1cfc2ab30846c395748776d4cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 17:55:40 crc kubenswrapper[4892]: E0217 17:55:40.447736 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(85130c692c98dcf53e7226e305d47dc81ce96e1cfc2ab30846c395748776d4cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:40 crc kubenswrapper[4892]: E0217 17:55:40.447764 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(85130c692c98dcf53e7226e305d47dc81ce96e1cfc2ab30846c395748776d4cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:40 crc kubenswrapper[4892]: E0217 17:55:40.447838 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace(bfe219b7-e412-4fb9-8b96-36ed80ea8cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace(bfe219b7-e412-4fb9-8b96-36ed80ea8cd3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(85130c692c98dcf53e7226e305d47dc81ce96e1cfc2ab30846c395748776d4cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" Feb 17 17:55:45 crc kubenswrapper[4892]: I0217 17:55:45.360596 4892 scope.go:117] "RemoveContainer" containerID="304725a2b080e31a96b7983f19b1eedea834a9b2e00608dc922e78f46533ae2a" Feb 17 17:55:45 crc kubenswrapper[4892]: E0217 17:55:45.361534 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lxpxh_openshift-multus(43b12f44-0079-4031-9b1d-492c374250df)\"" pod="openshift-multus/multus-lxpxh" podUID="43b12f44-0079-4031-9b1d-492c374250df" Feb 17 17:55:54 crc kubenswrapper[4892]: I0217 17:55:54.359147 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:54 crc kubenswrapper[4892]: I0217 17:55:54.360441 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:54 crc kubenswrapper[4892]: E0217 17:55:54.391402 4892 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9489eff9919ea07352e4e16edc34e872b0d61b04e56ae0412235bd62b570b63f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 17:55:54 crc kubenswrapper[4892]: E0217 17:55:54.391497 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9489eff9919ea07352e4e16edc34e872b0d61b04e56ae0412235bd62b570b63f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:54 crc kubenswrapper[4892]: E0217 17:55:54.391533 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9489eff9919ea07352e4e16edc34e872b0d61b04e56ae0412235bd62b570b63f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:55:54 crc kubenswrapper[4892]: E0217 17:55:54.391597 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace(bfe219b7-e412-4fb9-8b96-36ed80ea8cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace(bfe219b7-e412-4fb9-8b96-36ed80ea8cd3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_openshift-marketplace_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3_0(9489eff9919ea07352e4e16edc34e872b0d61b04e56ae0412235bd62b570b63f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" Feb 17 17:55:56 crc kubenswrapper[4892]: I0217 17:55:56.359915 4892 scope.go:117] "RemoveContainer" containerID="304725a2b080e31a96b7983f19b1eedea834a9b2e00608dc922e78f46533ae2a" Feb 17 17:55:57 crc kubenswrapper[4892]: I0217 17:55:57.267903 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxpxh_43b12f44-0079-4031-9b1d-492c374250df/kube-multus/2.log" Feb 17 17:55:57 crc kubenswrapper[4892]: I0217 17:55:57.268424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxpxh" event={"ID":"43b12f44-0079-4031-9b1d-492c374250df","Type":"ContainerStarted","Data":"8cc6ce99708a774b897b39b4c353cc37c9a11b076cf96c762e1dbcac0283655b"} Feb 17 17:56:03 crc kubenswrapper[4892]: I0217 17:56:03.189536 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-db2ft" Feb 17 17:56:07 crc kubenswrapper[4892]: I0217 17:56:07.359594 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:56:07 crc kubenswrapper[4892]: I0217 17:56:07.360642 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:56:07 crc kubenswrapper[4892]: I0217 17:56:07.425223 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:56:07 crc kubenswrapper[4892]: I0217 17:56:07.425287 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:56:07 crc kubenswrapper[4892]: I0217 17:56:07.806693 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx"] Feb 17 17:56:08 crc kubenswrapper[4892]: I0217 17:56:08.360710 4892 generic.go:334] "Generic (PLEG): container finished" podID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerID="394e022f8c695f72bd2de721ebf8eb4b7158cdfc9bc8c739d9a8848f32482fde" exitCode=0 Feb 17 17:56:08 crc kubenswrapper[4892]: I0217 17:56:08.360758 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" event={"ID":"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3","Type":"ContainerDied","Data":"394e022f8c695f72bd2de721ebf8eb4b7158cdfc9bc8c739d9a8848f32482fde"} Feb 17 17:56:08 crc kubenswrapper[4892]: I0217 17:56:08.360792 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" 
event={"ID":"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3","Type":"ContainerStarted","Data":"ef36fb19f835c69d04c7279c53d462a6d019fde259c5fa0e150834974b895a89"} Feb 17 17:56:09 crc kubenswrapper[4892]: E0217 17:56:09.824569 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe219b7_e412_4fb9_8b96_36ed80ea8cd3.slice/crio-3301499e71dfc87bc65f56421257ddb454c7c54df0f9f62c01545c1e141893de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe219b7_e412_4fb9_8b96_36ed80ea8cd3.slice/crio-conmon-3301499e71dfc87bc65f56421257ddb454c7c54df0f9f62c01545c1e141893de.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:56:10 crc kubenswrapper[4892]: I0217 17:56:10.385244 4892 generic.go:334] "Generic (PLEG): container finished" podID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerID="3301499e71dfc87bc65f56421257ddb454c7c54df0f9f62c01545c1e141893de" exitCode=0 Feb 17 17:56:10 crc kubenswrapper[4892]: I0217 17:56:10.385315 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" event={"ID":"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3","Type":"ContainerDied","Data":"3301499e71dfc87bc65f56421257ddb454c7c54df0f9f62c01545c1e141893de"} Feb 17 17:56:11 crc kubenswrapper[4892]: I0217 17:56:11.401525 4892 generic.go:334] "Generic (PLEG): container finished" podID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerID="9e238d52cf15695d584baee385780e5a5f8c12a184344147039c8e23cc402949" exitCode=0 Feb 17 17:56:11 crc kubenswrapper[4892]: I0217 17:56:11.402114 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" 
event={"ID":"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3","Type":"ContainerDied","Data":"9e238d52cf15695d584baee385780e5a5f8c12a184344147039c8e23cc402949"} Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.778288 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.953148 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-bundle\") pod \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.953265 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkg2t\" (UniqueName: \"kubernetes.io/projected/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-kube-api-access-bkg2t\") pod \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.953367 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-util\") pod \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\" (UID: \"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3\") " Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.955042 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-bundle" (OuterVolumeSpecName: "bundle") pod "bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" (UID: "bfe219b7-e412-4fb9-8b96-36ed80ea8cd3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.965000 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-kube-api-access-bkg2t" (OuterVolumeSpecName: "kube-api-access-bkg2t") pod "bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" (UID: "bfe219b7-e412-4fb9-8b96-36ed80ea8cd3"). InnerVolumeSpecName "kube-api-access-bkg2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:56:12 crc kubenswrapper[4892]: I0217 17:56:12.985065 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-util" (OuterVolumeSpecName: "util") pod "bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" (UID: "bfe219b7-e412-4fb9-8b96-36ed80ea8cd3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:56:13 crc kubenswrapper[4892]: I0217 17:56:13.056433 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:56:13 crc kubenswrapper[4892]: I0217 17:56:13.056490 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkg2t\" (UniqueName: \"kubernetes.io/projected/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-kube-api-access-bkg2t\") on node \"crc\" DevicePath \"\"" Feb 17 17:56:13 crc kubenswrapper[4892]: I0217 17:56:13.056504 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfe219b7-e412-4fb9-8b96-36ed80ea8cd3-util\") on node \"crc\" DevicePath \"\"" Feb 17 17:56:13 crc kubenswrapper[4892]: I0217 17:56:13.427460 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" 
event={"ID":"bfe219b7-e412-4fb9-8b96-36ed80ea8cd3","Type":"ContainerDied","Data":"ef36fb19f835c69d04c7279c53d462a6d019fde259c5fa0e150834974b895a89"} Feb 17 17:56:13 crc kubenswrapper[4892]: I0217 17:56:13.427517 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx" Feb 17 17:56:13 crc kubenswrapper[4892]: I0217 17:56:13.427525 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef36fb19f835c69d04c7279c53d462a6d019fde259c5fa0e150834974b895a89" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.590058 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-7nmvs"] Feb 17 17:56:17 crc kubenswrapper[4892]: E0217 17:56:17.590921 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="pull" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.590938 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="pull" Feb 17 17:56:17 crc kubenswrapper[4892]: E0217 17:56:17.590954 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="extract" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.590961 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="extract" Feb 17 17:56:17 crc kubenswrapper[4892]: E0217 17:56:17.590973 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="util" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.590980 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="util" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.591180 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bfe219b7-e412-4fb9-8b96-36ed80ea8cd3" containerName="extract" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.591646 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.594565 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2wmd6" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.594595 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.595231 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.608201 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-7nmvs"] Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.629102 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7lj\" (UniqueName: \"kubernetes.io/projected/e291b988-bfc6-47d5-864a-71c877507a09-kube-api-access-ss7lj\") pod \"nmstate-operator-694c9596b7-7nmvs\" (UID: \"e291b988-bfc6-47d5-864a-71c877507a09\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.729824 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7lj\" (UniqueName: \"kubernetes.io/projected/e291b988-bfc6-47d5-864a-71c877507a09-kube-api-access-ss7lj\") pod \"nmstate-operator-694c9596b7-7nmvs\" (UID: \"e291b988-bfc6-47d5-864a-71c877507a09\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.746353 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ss7lj\" (UniqueName: \"kubernetes.io/projected/e291b988-bfc6-47d5-864a-71c877507a09-kube-api-access-ss7lj\") pod \"nmstate-operator-694c9596b7-7nmvs\" (UID: \"e291b988-bfc6-47d5-864a-71c877507a09\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" Feb 17 17:56:17 crc kubenswrapper[4892]: I0217 17:56:17.911234 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" Feb 17 17:56:18 crc kubenswrapper[4892]: I0217 17:56:18.141256 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-7nmvs"] Feb 17 17:56:18 crc kubenswrapper[4892]: W0217 17:56:18.145739 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode291b988_bfc6_47d5_864a_71c877507a09.slice/crio-707f35c0e680690160a000286bb01f80a609d5e9077b8d5fbfec3ee13e203d3e WatchSource:0}: Error finding container 707f35c0e680690160a000286bb01f80a609d5e9077b8d5fbfec3ee13e203d3e: Status 404 returned error can't find the container with id 707f35c0e680690160a000286bb01f80a609d5e9077b8d5fbfec3ee13e203d3e Feb 17 17:56:18 crc kubenswrapper[4892]: I0217 17:56:18.463527 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" event={"ID":"e291b988-bfc6-47d5-864a-71c877507a09","Type":"ContainerStarted","Data":"707f35c0e680690160a000286bb01f80a609d5e9077b8d5fbfec3ee13e203d3e"} Feb 17 17:56:20 crc kubenswrapper[4892]: I0217 17:56:20.477038 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" event={"ID":"e291b988-bfc6-47d5-864a-71c877507a09","Type":"ContainerStarted","Data":"ec4d7379d8c3217798218799fa87dee513b7d4efb5fb0300e0aafad5d05ba9a1"} Feb 17 17:56:20 crc kubenswrapper[4892]: I0217 17:56:20.498646 4892 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-7nmvs" podStartSLOduration=1.548579316 podStartE2EDuration="3.49862523s" podCreationTimestamp="2026-02-17 17:56:17 +0000 UTC" firstStartedPulling="2026-02-17 17:56:18.148250689 +0000 UTC m=+749.523653954" lastFinishedPulling="2026-02-17 17:56:20.098296603 +0000 UTC m=+751.473699868" observedRunningTime="2026-02-17 17:56:20.493754858 +0000 UTC m=+751.869158133" watchObservedRunningTime="2026-02-17 17:56:20.49862523 +0000 UTC m=+751.874028485" Feb 17 17:56:24 crc kubenswrapper[4892]: I0217 17:56:24.182635 4892 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.441332 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-89vmm"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.442896 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.448170 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.449153 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.449636 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-png7m" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.452490 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.472719 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqx7m\" (UniqueName: \"kubernetes.io/projected/91f1ad1b-9ce2-44b8-bdd0-6528b0563442-kube-api-access-tqx7m\") pod \"nmstate-metrics-58c85c668d-89vmm\" (UID: \"91f1ad1b-9ce2-44b8-bdd0-6528b0563442\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.472788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrthz\" (UniqueName: \"kubernetes.io/projected/0190d61f-6f4b-43a1-a439-acc019e8353a-kube-api-access-wrthz\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.473008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0190d61f-6f4b-43a1-a439-acc019e8353a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.476288 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-99qbz"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.477195 4892 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.484882 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-89vmm"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.490742 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.568970 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.570080 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574193 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0190d61f-6f4b-43a1-a439-acc019e8353a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574246 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqx7m\" (UniqueName: \"kubernetes.io/projected/91f1ad1b-9ce2-44b8-bdd0-6528b0563442-kube-api-access-tqx7m\") pod \"nmstate-metrics-58c85c668d-89vmm\" (UID: \"91f1ad1b-9ce2-44b8-bdd0-6528b0563442\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574274 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrthz\" (UniqueName: \"kubernetes.io/projected/0190d61f-6f4b-43a1-a439-acc019e8353a-kube-api-access-wrthz\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") " 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574305 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjccx\" (UniqueName: \"kubernetes.io/projected/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-kube-api-access-rjccx\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574331 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-dbus-socket\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-nmstate-lock\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.574410 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-ovs-socket\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: E0217 17:56:26.574549 4892 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 17 17:56:26 crc kubenswrapper[4892]: E0217 17:56:26.574596 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0190d61f-6f4b-43a1-a439-acc019e8353a-tls-key-pair podName:0190d61f-6f4b-43a1-a439-acc019e8353a nodeName:}" failed. No retries permitted until 2026-02-17 17:56:27.074578519 +0000 UTC m=+758.449981784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0190d61f-6f4b-43a1-a439-acc019e8353a-tls-key-pair") pod "nmstate-webhook-866bcb46dc-v6h8d" (UID: "0190d61f-6f4b-43a1-a439-acc019e8353a") : secret "openshift-nmstate-webhook" not found Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.577432 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hpbvr" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.577619 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.577752 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.584675 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.594203 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqx7m\" (UniqueName: \"kubernetes.io/projected/91f1ad1b-9ce2-44b8-bdd0-6528b0563442-kube-api-access-tqx7m\") pod \"nmstate-metrics-58c85c668d-89vmm\" (UID: \"91f1ad1b-9ce2-44b8-bdd0-6528b0563442\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.616313 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrthz\" (UniqueName: \"kubernetes.io/projected/0190d61f-6f4b-43a1-a439-acc019e8353a-kube-api-access-wrthz\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") 
" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.676213 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-dbus-socket\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.676511 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-nmstate-lock\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.676617 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-nmstate-lock\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.676586 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-dbus-socket\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.676833 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/642744dc-fe94-49d4-98a0-64115704ead8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" 
Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.676990 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h944s\" (UniqueName: \"kubernetes.io/projected/642744dc-fe94-49d4-98a0-64115704ead8-kube-api-access-h944s\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.677211 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-ovs-socket\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.677278 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/642744dc-fe94-49d4-98a0-64115704ead8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.677458 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjccx\" (UniqueName: \"kubernetes.io/projected/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-kube-api-access-rjccx\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.677875 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-ovs-socket\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " 
pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.707876 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjccx\" (UniqueName: \"kubernetes.io/projected/ffff86a7-6135-4bc5-b5ea-2b5f219551e4-kube-api-access-rjccx\") pod \"nmstate-handler-99qbz\" (UID: \"ffff86a7-6135-4bc5-b5ea-2b5f219551e4\") " pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.752924 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86d4db4587-9z4ls"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.754090 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.765024 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.778828 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-service-ca\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.778889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-serving-cert\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.778918 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5q26\" 
(UniqueName: \"kubernetes.io/projected/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-kube-api-access-d5q26\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.778953 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/642744dc-fe94-49d4-98a0-64115704ead8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.778981 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h944s\" (UniqueName: \"kubernetes.io/projected/642744dc-fe94-49d4-98a0-64115704ead8-kube-api-access-h944s\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.779018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/642744dc-fe94-49d4-98a0-64115704ead8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.779072 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-trusted-ca-bundle\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.779106 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-oauth-config\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.779147 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-config\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.779172 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-oauth-serving-cert\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: E0217 17:56:26.779363 4892 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 17 17:56:26 crc kubenswrapper[4892]: E0217 17:56:26.779411 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642744dc-fe94-49d4-98a0-64115704ead8-plugin-serving-cert podName:642744dc-fe94-49d4-98a0-64115704ead8 nodeName:}" failed. No retries permitted until 2026-02-17 17:56:27.279393634 +0000 UTC m=+758.654796899 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/642744dc-fe94-49d4-98a0-64115704ead8-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-cvjjp" (UID: "642744dc-fe94-49d4-98a0-64115704ead8") : secret "plugin-serving-cert" not found Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.780478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/642744dc-fe94-49d4-98a0-64115704ead8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.790586 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d4db4587-9z4ls"] Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.792037 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.797905 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h944s\" (UniqueName: \"kubernetes.io/projected/642744dc-fe94-49d4-98a0-64115704ead8-kube-api-access-h944s\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:26 crc kubenswrapper[4892]: W0217 17:56:26.835152 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffff86a7_6135_4bc5_b5ea_2b5f219551e4.slice/crio-bc651d7ff042c9f83187e37bf3ce69d550dd1861f6a96b620daaeba87e50d029 WatchSource:0}: Error finding container bc651d7ff042c9f83187e37bf3ce69d550dd1861f6a96b620daaeba87e50d029: Status 404 returned error can't find the container with id 
bc651d7ff042c9f83187e37bf3ce69d550dd1861f6a96b620daaeba87e50d029 Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882012 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-service-ca\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882075 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-serving-cert\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882098 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5q26\" (UniqueName: \"kubernetes.io/projected/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-kube-api-access-d5q26\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882271 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-trusted-ca-bundle\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882320 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-oauth-config\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " 
pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-config\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.882446 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-oauth-serving-cert\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.884002 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-oauth-serving-cert\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.884232 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-service-ca\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.884245 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-config\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc 
kubenswrapper[4892]: I0217 17:56:26.885980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-trusted-ca-bundle\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.894483 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-serving-cert\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.897828 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-console-oauth-config\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.899711 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5q26\" (UniqueName: \"kubernetes.io/projected/5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c-kube-api-access-d5q26\") pod \"console-86d4db4587-9z4ls\" (UID: \"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c\") " pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:26 crc kubenswrapper[4892]: I0217 17:56:26.999004 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-89vmm"] Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.067775 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.097299 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0190d61f-6f4b-43a1-a439-acc019e8353a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.101583 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0190d61f-6f4b-43a1-a439-acc019e8353a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-v6h8d\" (UID: \"0190d61f-6f4b-43a1-a439-acc019e8353a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.301662 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/642744dc-fe94-49d4-98a0-64115704ead8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.305707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/642744dc-fe94-49d4-98a0-64115704ead8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-cvjjp\" (UID: \"642744dc-fe94-49d4-98a0-64115704ead8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.377769 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.495257 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.529789 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d4db4587-9z4ls"] Feb 17 17:56:27 crc kubenswrapper[4892]: W0217 17:56:27.534379 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b36f5d2_b7b4_4e18_8a28_f6fb5cc5944c.slice/crio-129b05f8acd865ad87ec6215dcf51cda7956d30ffeb0103954a4ca09e87d3c5f WatchSource:0}: Error finding container 129b05f8acd865ad87ec6215dcf51cda7956d30ffeb0103954a4ca09e87d3c5f: Status 404 returned error can't find the container with id 129b05f8acd865ad87ec6215dcf51cda7956d30ffeb0103954a4ca09e87d3c5f Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.537430 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-99qbz" event={"ID":"ffff86a7-6135-4bc5-b5ea-2b5f219551e4","Type":"ContainerStarted","Data":"bc651d7ff042c9f83187e37bf3ce69d550dd1861f6a96b620daaeba87e50d029"} Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.538727 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" event={"ID":"91f1ad1b-9ce2-44b8-bdd0-6528b0563442","Type":"ContainerStarted","Data":"e9f877f6bd4c8684e19dbda616af034958aef28775bd32fd019135f731806dd3"} Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.703648 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d"] Feb 17 17:56:27 crc kubenswrapper[4892]: W0217 17:56:27.727993 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0190d61f_6f4b_43a1_a439_acc019e8353a.slice/crio-c9d495c1cf92d1c244ee19356a225e36ef336e5cbdb95756a8b15b0270b19ffa WatchSource:0}: Error finding container c9d495c1cf92d1c244ee19356a225e36ef336e5cbdb95756a8b15b0270b19ffa: Status 404 returned error can't find the container with id c9d495c1cf92d1c244ee19356a225e36ef336e5cbdb95756a8b15b0270b19ffa Feb 17 17:56:27 crc kubenswrapper[4892]: I0217 17:56:27.807070 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp"] Feb 17 17:56:27 crc kubenswrapper[4892]: W0217 17:56:27.811024 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod642744dc_fe94_49d4_98a0_64115704ead8.slice/crio-c2e510d595e05867a4cd49ed1c9462f83c393eaa19c60520b6c5a9bd16774e3d WatchSource:0}: Error finding container c2e510d595e05867a4cd49ed1c9462f83c393eaa19c60520b6c5a9bd16774e3d: Status 404 returned error can't find the container with id c2e510d595e05867a4cd49ed1c9462f83c393eaa19c60520b6c5a9bd16774e3d Feb 17 17:56:28 crc kubenswrapper[4892]: I0217 17:56:28.545105 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" event={"ID":"0190d61f-6f4b-43a1-a439-acc019e8353a","Type":"ContainerStarted","Data":"c9d495c1cf92d1c244ee19356a225e36ef336e5cbdb95756a8b15b0270b19ffa"} Feb 17 17:56:28 crc kubenswrapper[4892]: I0217 17:56:28.546220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" event={"ID":"642744dc-fe94-49d4-98a0-64115704ead8","Type":"ContainerStarted","Data":"c2e510d595e05867a4cd49ed1c9462f83c393eaa19c60520b6c5a9bd16774e3d"} Feb 17 17:56:28 crc kubenswrapper[4892]: I0217 17:56:28.548210 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d4db4587-9z4ls" 
event={"ID":"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c","Type":"ContainerStarted","Data":"d8f7931f5c5abc050bd3d460a6279ed555f4b9dcd86e5d03b4ebce905a0e5918"} Feb 17 17:56:28 crc kubenswrapper[4892]: I0217 17:56:28.548264 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d4db4587-9z4ls" event={"ID":"5b36f5d2-b7b4-4e18-8a28-f6fb5cc5944c","Type":"ContainerStarted","Data":"129b05f8acd865ad87ec6215dcf51cda7956d30ffeb0103954a4ca09e87d3c5f"} Feb 17 17:56:28 crc kubenswrapper[4892]: I0217 17:56:28.571896 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86d4db4587-9z4ls" podStartSLOduration=2.571875127 podStartE2EDuration="2.571875127s" podCreationTimestamp="2026-02-17 17:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:56:28.566957744 +0000 UTC m=+759.942361019" watchObservedRunningTime="2026-02-17 17:56:28.571875127 +0000 UTC m=+759.947278382" Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.556397 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" event={"ID":"91f1ad1b-9ce2-44b8-bdd0-6528b0563442","Type":"ContainerStarted","Data":"f9957c38144461084f8c8c3c9d0dc068f958172f15b6822457c5fe6da576a2eb"} Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.558077 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" event={"ID":"0190d61f-6f4b-43a1-a439-acc019e8353a","Type":"ContainerStarted","Data":"5151d584bbdad9d8da9be5d61c048b3f96790a160e840b2897d9f248d0e418dc"} Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.558185 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.559511 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-handler-99qbz" event={"ID":"ffff86a7-6135-4bc5-b5ea-2b5f219551e4","Type":"ContainerStarted","Data":"859697b640c7b141ef0000ac21078fab44d8e6f1cd9d05d42ec1e2d447dd2713"} Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.559664 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.574608 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" podStartSLOduration=2.06925215 podStartE2EDuration="3.574590552s" podCreationTimestamp="2026-02-17 17:56:26 +0000 UTC" firstStartedPulling="2026-02-17 17:56:27.729880899 +0000 UTC m=+759.105284154" lastFinishedPulling="2026-02-17 17:56:29.235219261 +0000 UTC m=+760.610622556" observedRunningTime="2026-02-17 17:56:29.570796438 +0000 UTC m=+760.946199723" watchObservedRunningTime="2026-02-17 17:56:29.574590552 +0000 UTC m=+760.949993817" Feb 17 17:56:29 crc kubenswrapper[4892]: I0217 17:56:29.595128 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-99qbz" podStartSLOduration=1.198667636 podStartE2EDuration="3.595113719s" podCreationTimestamp="2026-02-17 17:56:26 +0000 UTC" firstStartedPulling="2026-02-17 17:56:26.837491753 +0000 UTC m=+758.212895018" lastFinishedPulling="2026-02-17 17:56:29.233937836 +0000 UTC m=+760.609341101" observedRunningTime="2026-02-17 17:56:29.593242398 +0000 UTC m=+760.968645683" watchObservedRunningTime="2026-02-17 17:56:29.595113719 +0000 UTC m=+760.970516984" Feb 17 17:56:30 crc kubenswrapper[4892]: I0217 17:56:30.570554 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" event={"ID":"642744dc-fe94-49d4-98a0-64115704ead8","Type":"ContainerStarted","Data":"1759f7485d3158003ca9bd278e4fa242845ddcfeef1f09fa368b216950092b7b"} Feb 17 17:56:30 crc 
kubenswrapper[4892]: I0217 17:56:30.590185 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-cvjjp" podStartSLOduration=2.055873746 podStartE2EDuration="4.590155085s" podCreationTimestamp="2026-02-17 17:56:26 +0000 UTC" firstStartedPulling="2026-02-17 17:56:27.813286326 +0000 UTC m=+759.188689591" lastFinishedPulling="2026-02-17 17:56:30.347567665 +0000 UTC m=+761.722970930" observedRunningTime="2026-02-17 17:56:30.581231723 +0000 UTC m=+761.956634988" watchObservedRunningTime="2026-02-17 17:56:30.590155085 +0000 UTC m=+761.965558400" Feb 17 17:56:32 crc kubenswrapper[4892]: I0217 17:56:32.595898 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" event={"ID":"91f1ad1b-9ce2-44b8-bdd0-6528b0563442","Type":"ContainerStarted","Data":"f103dc13d4f16a2642ab045200318c66d43ed81afce219bb1df43f23d8acb052"} Feb 17 17:56:32 crc kubenswrapper[4892]: I0217 17:56:32.623836 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-89vmm" podStartSLOduration=1.544971935 podStartE2EDuration="6.623785561s" podCreationTimestamp="2026-02-17 17:56:26 +0000 UTC" firstStartedPulling="2026-02-17 17:56:27.005246601 +0000 UTC m=+758.380649866" lastFinishedPulling="2026-02-17 17:56:32.084060197 +0000 UTC m=+763.459463492" observedRunningTime="2026-02-17 17:56:32.61530061 +0000 UTC m=+763.990703965" watchObservedRunningTime="2026-02-17 17:56:32.623785561 +0000 UTC m=+763.999188856" Feb 17 17:56:36 crc kubenswrapper[4892]: I0217 17:56:36.831784 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-99qbz" Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 17:56:37.068084 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 
17:56:37.068142 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 17:56:37.074280 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 17:56:37.424871 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 17:56:37.424964 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 17:56:37.647073 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86d4db4587-9z4ls" Feb 17 17:56:37 crc kubenswrapper[4892]: I0217 17:56:37.734251 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zc25m"] Feb 17 17:56:47 crc kubenswrapper[4892]: I0217 17:56:47.388065 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-v6h8d" Feb 17 17:57:02 crc kubenswrapper[4892]: I0217 17:57:02.786373 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zc25m" podUID="0b0cbdbb-671e-41e1-b494-a369938dab8e" containerName="console" containerID="cri-o://f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c" gracePeriod=15 Feb 17 17:57:03 crc 
kubenswrapper[4892]: I0217 17:57:03.257244 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zc25m_0b0cbdbb-671e-41e1-b494-a369938dab8e/console/0.log" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.257569 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.355865 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-oauth-config\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.355971 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-service-ca\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356047 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-trusted-ca-bundle\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356083 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9snm\" (UniqueName: \"kubernetes.io/projected/0b0cbdbb-671e-41e1-b494-a369938dab8e-kube-api-access-r9snm\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356101 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-oauth-serving-cert\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356118 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-serving-cert\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356132 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-config\") pod \"0b0cbdbb-671e-41e1-b494-a369938dab8e\" (UID: \"0b0cbdbb-671e-41e1-b494-a369938dab8e\") " Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356907 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356924 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-config" (OuterVolumeSpecName: "console-config") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.356958 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.357118 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.361392 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0cbdbb-671e-41e1-b494-a369938dab8e-kube-api-access-r9snm" (OuterVolumeSpecName: "kube-api-access-r9snm") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "kube-api-access-r9snm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.364462 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.365358 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0b0cbdbb-671e-41e1-b494-a369938dab8e" (UID: "0b0cbdbb-671e-41e1-b494-a369938dab8e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457647 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457677 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457687 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9snm\" (UniqueName: \"kubernetes.io/projected/0b0cbdbb-671e-41e1-b494-a369938dab8e-kube-api-access-r9snm\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457696 4892 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457704 4892 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457716 4892 reconciler_common.go:293] "Volume detached for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.457724 4892 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b0cbdbb-671e-41e1-b494-a369938dab8e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.856380 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zc25m_0b0cbdbb-671e-41e1-b494-a369938dab8e/console/0.log" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.856423 4892 generic.go:334] "Generic (PLEG): container finished" podID="0b0cbdbb-671e-41e1-b494-a369938dab8e" containerID="f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c" exitCode=2 Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.856452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc25m" event={"ID":"0b0cbdbb-671e-41e1-b494-a369938dab8e","Type":"ContainerDied","Data":"f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c"} Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.856475 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc25m" event={"ID":"0b0cbdbb-671e-41e1-b494-a369938dab8e","Type":"ContainerDied","Data":"cbe2bffa717fb5e983f997e71d1e9d0608da3e5cd7276d126e62aee0b5791db7"} Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.856525 4892 scope.go:117] "RemoveContainer" containerID="f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.856535 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zc25m" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.875502 4892 scope.go:117] "RemoveContainer" containerID="f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c" Feb 17 17:57:03 crc kubenswrapper[4892]: E0217 17:57:03.876194 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c\": container with ID starting with f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c not found: ID does not exist" containerID="f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.876237 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c"} err="failed to get container status \"f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c\": rpc error: code = NotFound desc = could not find container \"f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c\": container with ID starting with f494ef23ba8e3bb38506a532e26207a1d357512b763a47b1478758f86f6cd70c not found: ID does not exist" Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.880458 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zc25m"] Feb 17 17:57:03 crc kubenswrapper[4892]: I0217 17:57:03.887868 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zc25m"] Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.842867 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq"] Feb 17 17:57:04 crc kubenswrapper[4892]: E0217 17:57:04.843765 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b0cbdbb-671e-41e1-b494-a369938dab8e" containerName="console" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.843799 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0cbdbb-671e-41e1-b494-a369938dab8e" containerName="console" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.844129 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b0cbdbb-671e-41e1-b494-a369938dab8e" containerName="console" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.845973 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.848348 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.852316 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq"] Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.983991 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.984066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dwk\" (UniqueName: \"kubernetes.io/projected/28fded22-d298-44b3-8cb8-6588578ba409-kube-api-access-k4dwk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:04 crc kubenswrapper[4892]: I0217 17:57:04.984312 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.085501 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.085601 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.085649 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dwk\" (UniqueName: \"kubernetes.io/projected/28fded22-d298-44b3-8cb8-6588578ba409-kube-api-access-k4dwk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: 
I0217 17:57:05.086233 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.087012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.107709 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dwk\" (UniqueName: \"kubernetes.io/projected/28fded22-d298-44b3-8cb8-6588578ba409-kube-api-access-k4dwk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.170865 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.372894 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b0cbdbb-671e-41e1-b494-a369938dab8e" path="/var/lib/kubelet/pods/0b0cbdbb-671e-41e1-b494-a369938dab8e/volumes" Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.502121 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq"] Feb 17 17:57:05 crc kubenswrapper[4892]: W0217 17:57:05.519330 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28fded22_d298_44b3_8cb8_6588578ba409.slice/crio-5cd6e53a5efce54b8c02378b9f7a8c2738b77bbd772959d9d8137499465ddd2a WatchSource:0}: Error finding container 5cd6e53a5efce54b8c02378b9f7a8c2738b77bbd772959d9d8137499465ddd2a: Status 404 returned error can't find the container with id 5cd6e53a5efce54b8c02378b9f7a8c2738b77bbd772959d9d8137499465ddd2a Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.882061 4892 generic.go:334] "Generic (PLEG): container finished" podID="28fded22-d298-44b3-8cb8-6588578ba409" containerID="0204b17f18e385087bd56caac731c6b34298d2202020995bde2369eaa1567967" exitCode=0 Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.882132 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" event={"ID":"28fded22-d298-44b3-8cb8-6588578ba409","Type":"ContainerDied","Data":"0204b17f18e385087bd56caac731c6b34298d2202020995bde2369eaa1567967"} Feb 17 17:57:05 crc kubenswrapper[4892]: I0217 17:57:05.882398 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" 
event={"ID":"28fded22-d298-44b3-8cb8-6588578ba409","Type":"ContainerStarted","Data":"5cd6e53a5efce54b8c02378b9f7a8c2738b77bbd772959d9d8137499465ddd2a"} Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.425673 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.425761 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.425855 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.426854 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"921d973508b9907ab2d7e5529ed27c9f162bd1cd21401233f975dc91366a6d72"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.426944 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://921d973508b9907ab2d7e5529ed27c9f162bd1cd21401233f975dc91366a6d72" gracePeriod=600 Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.904577 
4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="921d973508b9907ab2d7e5529ed27c9f162bd1cd21401233f975dc91366a6d72" exitCode=0 Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.904751 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"921d973508b9907ab2d7e5529ed27c9f162bd1cd21401233f975dc91366a6d72"} Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.905155 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"277a892ddcda11348b051b3a2c03162bd0db1300ec13dbc17277b62b780132f1"} Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.905197 4892 scope.go:117] "RemoveContainer" containerID="4bf1ebbe703d10ba0bd5ce4189499a9391545c941bb9c55b294cc5679c9bd301" Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.910521 4892 generic.go:334] "Generic (PLEG): container finished" podID="28fded22-d298-44b3-8cb8-6588578ba409" containerID="3b46697995e0a4cd7fe761f8081ac868f062a247a08fa44cc84fb3a9edc45d66" exitCode=0 Feb 17 17:57:07 crc kubenswrapper[4892]: I0217 17:57:07.910582 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" event={"ID":"28fded22-d298-44b3-8cb8-6588578ba409","Type":"ContainerDied","Data":"3b46697995e0a4cd7fe761f8081ac868f062a247a08fa44cc84fb3a9edc45d66"} Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.406726 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2lp89"] Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.410552 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.419925 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lp89"] Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.554313 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-catalog-content\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.554377 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8j9k\" (UniqueName: \"kubernetes.io/projected/7c27ddee-0c35-496e-9705-ef75f1c326f5-kube-api-access-z8j9k\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.554460 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-utilities\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.656112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-catalog-content\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.656169 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z8j9k\" (UniqueName: \"kubernetes.io/projected/7c27ddee-0c35-496e-9705-ef75f1c326f5-kube-api-access-z8j9k\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.656241 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-utilities\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.656777 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-catalog-content\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.657275 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-utilities\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.690591 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8j9k\" (UniqueName: \"kubernetes.io/projected/7c27ddee-0c35-496e-9705-ef75f1c326f5-kube-api-access-z8j9k\") pod \"redhat-operators-2lp89\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.753475 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.941050 4892 generic.go:334] "Generic (PLEG): container finished" podID="28fded22-d298-44b3-8cb8-6588578ba409" containerID="9312d4ba5ba210c968081608e9cc164f398dbe07c21abb11104a91c01fb0f9da" exitCode=0 Feb 17 17:57:08 crc kubenswrapper[4892]: I0217 17:57:08.941352 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" event={"ID":"28fded22-d298-44b3-8cb8-6588578ba409","Type":"ContainerDied","Data":"9312d4ba5ba210c968081608e9cc164f398dbe07c21abb11104a91c01fb0f9da"} Feb 17 17:57:09 crc kubenswrapper[4892]: I0217 17:57:09.255612 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lp89"] Feb 17 17:57:09 crc kubenswrapper[4892]: W0217 17:57:09.261633 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c27ddee_0c35_496e_9705_ef75f1c326f5.slice/crio-cc18ec4f027c8b3a6b0de39cd0bb4122eeb72caa0da0ed49a84873354b779509 WatchSource:0}: Error finding container cc18ec4f027c8b3a6b0de39cd0bb4122eeb72caa0da0ed49a84873354b779509: Status 404 returned error can't find the container with id cc18ec4f027c8b3a6b0de39cd0bb4122eeb72caa0da0ed49a84873354b779509 Feb 17 17:57:09 crc kubenswrapper[4892]: I0217 17:57:09.949927 4892 generic.go:334] "Generic (PLEG): container finished" podID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerID="607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94" exitCode=0 Feb 17 17:57:09 crc kubenswrapper[4892]: I0217 17:57:09.950026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerDied","Data":"607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94"} Feb 17 17:57:09 crc 
kubenswrapper[4892]: I0217 17:57:09.950275 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerStarted","Data":"cc18ec4f027c8b3a6b0de39cd0bb4122eeb72caa0da0ed49a84873354b779509"} Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.267022 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.381277 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-util\") pod \"28fded22-d298-44b3-8cb8-6588578ba409\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.381368 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-bundle\") pod \"28fded22-d298-44b3-8cb8-6588578ba409\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.381476 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4dwk\" (UniqueName: \"kubernetes.io/projected/28fded22-d298-44b3-8cb8-6588578ba409-kube-api-access-k4dwk\") pod \"28fded22-d298-44b3-8cb8-6588578ba409\" (UID: \"28fded22-d298-44b3-8cb8-6588578ba409\") " Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.383024 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-bundle" (OuterVolumeSpecName: "bundle") pod "28fded22-d298-44b3-8cb8-6588578ba409" (UID: "28fded22-d298-44b3-8cb8-6588578ba409"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.394616 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fded22-d298-44b3-8cb8-6588578ba409-kube-api-access-k4dwk" (OuterVolumeSpecName: "kube-api-access-k4dwk") pod "28fded22-d298-44b3-8cb8-6588578ba409" (UID: "28fded22-d298-44b3-8cb8-6588578ba409"). InnerVolumeSpecName "kube-api-access-k4dwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.406386 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-util" (OuterVolumeSpecName: "util") pod "28fded22-d298-44b3-8cb8-6588578ba409" (UID: "28fded22-d298-44b3-8cb8-6588578ba409"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.483016 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-util\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.483051 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28fded22-d298-44b3-8cb8-6588578ba409-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.483066 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4dwk\" (UniqueName: \"kubernetes.io/projected/28fded22-d298-44b3-8cb8-6588578ba409-kube-api-access-k4dwk\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.957633 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" 
event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerStarted","Data":"d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a"} Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.960478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" event={"ID":"28fded22-d298-44b3-8cb8-6588578ba409","Type":"ContainerDied","Data":"5cd6e53a5efce54b8c02378b9f7a8c2738b77bbd772959d9d8137499465ddd2a"} Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.960500 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd6e53a5efce54b8c02378b9f7a8c2738b77bbd772959d9d8137499465ddd2a" Feb 17 17:57:10 crc kubenswrapper[4892]: I0217 17:57:10.960542 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq" Feb 17 17:57:11 crc kubenswrapper[4892]: I0217 17:57:11.978958 4892 generic.go:334] "Generic (PLEG): container finished" podID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerID="d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a" exitCode=0 Feb 17 17:57:11 crc kubenswrapper[4892]: I0217 17:57:11.979088 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerDied","Data":"d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a"} Feb 17 17:57:12 crc kubenswrapper[4892]: I0217 17:57:12.990493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerStarted","Data":"40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512"} Feb 17 17:57:13 crc kubenswrapper[4892]: I0217 17:57:13.028950 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-2lp89" podStartSLOduration=2.479668757 podStartE2EDuration="5.028927302s" podCreationTimestamp="2026-02-17 17:57:08 +0000 UTC" firstStartedPulling="2026-02-17 17:57:09.952349679 +0000 UTC m=+801.327752944" lastFinishedPulling="2026-02-17 17:57:12.501608194 +0000 UTC m=+803.877011489" observedRunningTime="2026-02-17 17:57:13.025251882 +0000 UTC m=+804.400655187" watchObservedRunningTime="2026-02-17 17:57:13.028927302 +0000 UTC m=+804.404330607" Feb 17 17:57:18 crc kubenswrapper[4892]: I0217 17:57:18.754636 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:18 crc kubenswrapper[4892]: I0217 17:57:18.755214 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:19 crc kubenswrapper[4892]: I0217 17:57:19.791195 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2lp89" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="registry-server" probeResult="failure" output=< Feb 17 17:57:19 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 17:57:19 crc kubenswrapper[4892]: > Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.009499 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq"] Feb 17 17:57:22 crc kubenswrapper[4892]: E0217 17:57:22.009858 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="util" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.009873 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="util" Feb 17 17:57:22 crc kubenswrapper[4892]: E0217 17:57:22.009893 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="pull" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.009901 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="pull" Feb 17 17:57:22 crc kubenswrapper[4892]: E0217 17:57:22.009929 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="extract" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.009937 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="extract" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.010100 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fded22-d298-44b3-8cb8-6588578ba409" containerName="extract" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.010627 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.015204 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.015704 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7mvkf" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.015804 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.017087 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.018367 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 
17:57:22.030645 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq"] Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.172513 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97b49200-bd30-4c85-b61d-3b75276c00ef-apiservice-cert\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.172576 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97b49200-bd30-4c85-b61d-3b75276c00ef-webhook-cert\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.172619 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5fx\" (UniqueName: \"kubernetes.io/projected/97b49200-bd30-4c85-b61d-3b75276c00ef-kube-api-access-2c5fx\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.273845 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97b49200-bd30-4c85-b61d-3b75276c00ef-webhook-cert\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc 
kubenswrapper[4892]: I0217 17:57:22.273920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5fx\" (UniqueName: \"kubernetes.io/projected/97b49200-bd30-4c85-b61d-3b75276c00ef-kube-api-access-2c5fx\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.273982 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97b49200-bd30-4c85-b61d-3b75276c00ef-apiservice-cert\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.279111 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97b49200-bd30-4c85-b61d-3b75276c00ef-webhook-cert\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.280512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97b49200-bd30-4c85-b61d-3b75276c00ef-apiservice-cert\") pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.295520 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5fx\" (UniqueName: \"kubernetes.io/projected/97b49200-bd30-4c85-b61d-3b75276c00ef-kube-api-access-2c5fx\") 
pod \"metallb-operator-controller-manager-6cbb547f6d-mlwwq\" (UID: \"97b49200-bd30-4c85-b61d-3b75276c00ef\") " pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.325920 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9"] Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.326714 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.327672 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.329330 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.329410 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-l42mv" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.330065 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.342987 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9"] Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.476796 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b319b422-94c7-47c8-9e1f-2bef71d833b1-webhook-cert\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc 
kubenswrapper[4892]: I0217 17:57:22.476944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48952\" (UniqueName: \"kubernetes.io/projected/b319b422-94c7-47c8-9e1f-2bef71d833b1-kube-api-access-48952\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.476973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b319b422-94c7-47c8-9e1f-2bef71d833b1-apiservice-cert\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.561567 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq"] Feb 17 17:57:22 crc kubenswrapper[4892]: W0217 17:57:22.570474 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b49200_bd30_4c85_b61d_3b75276c00ef.slice/crio-cd208110e565f41e801874625d498133771f5d6ceeffbb0452f10f3a0a13a42e WatchSource:0}: Error finding container cd208110e565f41e801874625d498133771f5d6ceeffbb0452f10f3a0a13a42e: Status 404 returned error can't find the container with id cd208110e565f41e801874625d498133771f5d6ceeffbb0452f10f3a0a13a42e Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.578045 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b319b422-94c7-47c8-9e1f-2bef71d833b1-webhook-cert\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " 
pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.578121 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48952\" (UniqueName: \"kubernetes.io/projected/b319b422-94c7-47c8-9e1f-2bef71d833b1-kube-api-access-48952\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.578155 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b319b422-94c7-47c8-9e1f-2bef71d833b1-apiservice-cert\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.585688 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b319b422-94c7-47c8-9e1f-2bef71d833b1-webhook-cert\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.587043 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b319b422-94c7-47c8-9e1f-2bef71d833b1-apiservice-cert\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.607536 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48952\" (UniqueName: 
\"kubernetes.io/projected/b319b422-94c7-47c8-9e1f-2bef71d833b1-kube-api-access-48952\") pod \"metallb-operator-webhook-server-7cbfbcbb66-8nlr9\" (UID: \"b319b422-94c7-47c8-9e1f-2bef71d833b1\") " pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:22 crc kubenswrapper[4892]: I0217 17:57:22.706659 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:23 crc kubenswrapper[4892]: I0217 17:57:23.003006 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9"] Feb 17 17:57:23 crc kubenswrapper[4892]: W0217 17:57:23.009673 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb319b422_94c7_47c8_9e1f_2bef71d833b1.slice/crio-40af75e10c2bf0fdcf7820d8683e4f6fe9fed704c08f41542519f6778a4ff031 WatchSource:0}: Error finding container 40af75e10c2bf0fdcf7820d8683e4f6fe9fed704c08f41542519f6778a4ff031: Status 404 returned error can't find the container with id 40af75e10c2bf0fdcf7820d8683e4f6fe9fed704c08f41542519f6778a4ff031 Feb 17 17:57:23 crc kubenswrapper[4892]: I0217 17:57:23.073716 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" event={"ID":"b319b422-94c7-47c8-9e1f-2bef71d833b1","Type":"ContainerStarted","Data":"40af75e10c2bf0fdcf7820d8683e4f6fe9fed704c08f41542519f6778a4ff031"} Feb 17 17:57:23 crc kubenswrapper[4892]: I0217 17:57:23.074785 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" event={"ID":"97b49200-bd30-4c85-b61d-3b75276c00ef","Type":"ContainerStarted","Data":"cd208110e565f41e801874625d498133771f5d6ceeffbb0452f10f3a0a13a42e"} Feb 17 17:57:26 crc kubenswrapper[4892]: I0217 17:57:26.099750 4892 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" event={"ID":"97b49200-bd30-4c85-b61d-3b75276c00ef","Type":"ContainerStarted","Data":"feb075606e46a0c45e80fab6497575931dd3f206222880f86f2eb50a45376ea4"} Feb 17 17:57:26 crc kubenswrapper[4892]: I0217 17:57:26.101102 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:57:26 crc kubenswrapper[4892]: I0217 17:57:26.120995 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" podStartSLOduration=2.097127653 podStartE2EDuration="5.120979664s" podCreationTimestamp="2026-02-17 17:57:21 +0000 UTC" firstStartedPulling="2026-02-17 17:57:22.573604019 +0000 UTC m=+813.949007284" lastFinishedPulling="2026-02-17 17:57:25.59745603 +0000 UTC m=+816.972859295" observedRunningTime="2026-02-17 17:57:26.119238107 +0000 UTC m=+817.494641392" watchObservedRunningTime="2026-02-17 17:57:26.120979664 +0000 UTC m=+817.496382919" Feb 17 17:57:28 crc kubenswrapper[4892]: I0217 17:57:28.116576 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" event={"ID":"b319b422-94c7-47c8-9e1f-2bef71d833b1","Type":"ContainerStarted","Data":"38cd0042c84591a74d31cc7a864ff6e12daced7312e53c5290152ef316824244"} Feb 17 17:57:28 crc kubenswrapper[4892]: I0217 17:57:28.117044 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:57:28 crc kubenswrapper[4892]: I0217 17:57:28.140266 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" podStartSLOduration=1.861302048 podStartE2EDuration="6.14023795s" podCreationTimestamp="2026-02-17 17:57:22 +0000 UTC" firstStartedPulling="2026-02-17 
17:57:23.011652221 +0000 UTC m=+814.387055486" lastFinishedPulling="2026-02-17 17:57:27.290588113 +0000 UTC m=+818.665991388" observedRunningTime="2026-02-17 17:57:28.133991999 +0000 UTC m=+819.509395284" watchObservedRunningTime="2026-02-17 17:57:28.14023795 +0000 UTC m=+819.515641245" Feb 17 17:57:28 crc kubenswrapper[4892]: I0217 17:57:28.823389 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:28 crc kubenswrapper[4892]: I0217 17:57:28.886292 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:29 crc kubenswrapper[4892]: I0217 17:57:29.085037 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lp89"] Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.135909 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2lp89" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="registry-server" containerID="cri-o://40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512" gracePeriod=2 Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.533891 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.614879 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-catalog-content\") pod \"7c27ddee-0c35-496e-9705-ef75f1c326f5\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.614917 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8j9k\" (UniqueName: \"kubernetes.io/projected/7c27ddee-0c35-496e-9705-ef75f1c326f5-kube-api-access-z8j9k\") pod \"7c27ddee-0c35-496e-9705-ef75f1c326f5\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.614953 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-utilities\") pod \"7c27ddee-0c35-496e-9705-ef75f1c326f5\" (UID: \"7c27ddee-0c35-496e-9705-ef75f1c326f5\") " Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.616036 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-utilities" (OuterVolumeSpecName: "utilities") pod "7c27ddee-0c35-496e-9705-ef75f1c326f5" (UID: "7c27ddee-0c35-496e-9705-ef75f1c326f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.619703 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c27ddee-0c35-496e-9705-ef75f1c326f5-kube-api-access-z8j9k" (OuterVolumeSpecName: "kube-api-access-z8j9k") pod "7c27ddee-0c35-496e-9705-ef75f1c326f5" (UID: "7c27ddee-0c35-496e-9705-ef75f1c326f5"). InnerVolumeSpecName "kube-api-access-z8j9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.716920 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8j9k\" (UniqueName: \"kubernetes.io/projected/7c27ddee-0c35-496e-9705-ef75f1c326f5-kube-api-access-z8j9k\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.716956 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.732918 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c27ddee-0c35-496e-9705-ef75f1c326f5" (UID: "7c27ddee-0c35-496e-9705-ef75f1c326f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:57:30 crc kubenswrapper[4892]: I0217 17:57:30.818068 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27ddee-0c35-496e-9705-ef75f1c326f5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.148017 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerDied","Data":"40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512"} Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.148045 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lp89" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.148083 4892 scope.go:117] "RemoveContainer" containerID="40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.147966 4892 generic.go:334] "Generic (PLEG): container finished" podID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerID="40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512" exitCode=0 Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.148229 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lp89" event={"ID":"7c27ddee-0c35-496e-9705-ef75f1c326f5","Type":"ContainerDied","Data":"cc18ec4f027c8b3a6b0de39cd0bb4122eeb72caa0da0ed49a84873354b779509"} Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.189771 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lp89"] Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.192097 4892 scope.go:117] "RemoveContainer" containerID="d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.201733 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2lp89"] Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.219116 4892 scope.go:117] "RemoveContainer" containerID="607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.237005 4892 scope.go:117] "RemoveContainer" containerID="40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512" Feb 17 17:57:31 crc kubenswrapper[4892]: E0217 17:57:31.244184 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512\": container with ID starting with 
40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512 not found: ID does not exist" containerID="40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.244235 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512"} err="failed to get container status \"40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512\": rpc error: code = NotFound desc = could not find container \"40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512\": container with ID starting with 40a2702568ee11f3e7efe1c758d276b95eaedb732455fca63b041d5474b17512 not found: ID does not exist" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.244268 4892 scope.go:117] "RemoveContainer" containerID="d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a" Feb 17 17:57:31 crc kubenswrapper[4892]: E0217 17:57:31.244707 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a\": container with ID starting with d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a not found: ID does not exist" containerID="d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.244739 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a"} err="failed to get container status \"d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a\": rpc error: code = NotFound desc = could not find container \"d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a\": container with ID starting with d8499bfd6465e23cfe046dfaf18b2681f1e3839818d7a6da0057d92aef11485a not found: ID does not 
exist" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.244758 4892 scope.go:117] "RemoveContainer" containerID="607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94" Feb 17 17:57:31 crc kubenswrapper[4892]: E0217 17:57:31.245344 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94\": container with ID starting with 607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94 not found: ID does not exist" containerID="607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.245367 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94"} err="failed to get container status \"607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94\": rpc error: code = NotFound desc = could not find container \"607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94\": container with ID starting with 607695cbf795b12df1d149137034b000319319cc90faa68947a9738f5a63da94 not found: ID does not exist" Feb 17 17:57:31 crc kubenswrapper[4892]: I0217 17:57:31.376540 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" path="/var/lib/kubelet/pods/7c27ddee-0c35-496e-9705-ef75f1c326f5/volumes" Feb 17 17:57:42 crc kubenswrapper[4892]: I0217 17:57:42.733934 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7cbfbcbb66-8nlr9" Feb 17 17:58:02 crc kubenswrapper[4892]: I0217 17:58:02.331042 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6cbb547f6d-mlwwq" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.183172 4892 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xfjst"] Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.183715 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="extract-utilities" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.183735 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="extract-utilities" Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.183758 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="extract-content" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.183768 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="extract-content" Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.183798 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="registry-server" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.183806 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="registry-server" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.183983 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c27ddee-0c35-496e-9705-ef75f1c326f5" containerName="registry-server" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.189532 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.195267 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.195555 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.199989 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-t9rvd" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.201470 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs"] Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.204938 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.208108 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.215790 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs"] Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218588 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87e188e4-25c7-4068-8336-8396f99d9a51-metrics-certs\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218720 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-reloader\") pod \"frr-k8s-xfjst\" (UID: 
\"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218756 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-metrics\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218784 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/87e188e4-25c7-4068-8336-8396f99d9a51-frr-startup\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218807 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-frr-conf\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218864 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-frr-sockets\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 
17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218905 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6t96\" (UniqueName: \"kubernetes.io/projected/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-kube-api-access-z6t96\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.218941 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftd2l\" (UniqueName: \"kubernetes.io/projected/87e188e4-25c7-4068-8336-8396f99d9a51-kube-api-access-ftd2l\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.293869 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-n6bwg"] Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.294993 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.299331 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-55rj9"] Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.301226 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.302405 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.307222 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.307239 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.307405 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.307536 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ln7kr" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.307754 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-n6bwg"] Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.319933 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6sq7\" (UniqueName: \"kubernetes.io/projected/b6b73500-2314-4f41-83cf-3da8514065e3-kube-api-access-m6sq7\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.319961 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-cert\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.319981 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5xj\" (UniqueName: \"kubernetes.io/projected/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-kube-api-access-gz5xj\") pod 
\"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320012 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87e188e4-25c7-4068-8336-8396f99d9a51-metrics-certs\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320028 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-metrics-certs\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320069 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-reloader\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320083 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-metrics\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc 
kubenswrapper[4892]: I0217 17:58:03.320106 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/87e188e4-25c7-4068-8336-8396f99d9a51-frr-startup\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320156 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-frr-conf\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320190 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-frr-sockets\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320212 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b6b73500-2314-4f41-83cf-3da8514065e3-metallb-excludel2\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320268 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.320966 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-reloader\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.321137 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-metrics\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.321807 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/87e188e4-25c7-4068-8336-8396f99d9a51-frr-startup\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.321999 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-frr-conf\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.322151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/87e188e4-25c7-4068-8336-8396f99d9a51-frr-sockets\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.322213 4892 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.322252 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-cert podName:dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf 
nodeName:}" failed. No retries permitted until 2026-02-17 17:58:03.822238874 +0000 UTC m=+855.197642139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-cert") pod "frr-k8s-webhook-server-78b44bf5bb-t4nhs" (UID: "dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf") : secret "frr-k8s-webhook-server-cert" not found Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.322546 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6t96\" (UniqueName: \"kubernetes.io/projected/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-kube-api-access-z6t96\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.322592 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-metrics-certs\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.322616 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftd2l\" (UniqueName: \"kubernetes.io/projected/87e188e4-25c7-4068-8336-8396f99d9a51-kube-api-access-ftd2l\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.325304 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87e188e4-25c7-4068-8336-8396f99d9a51-metrics-certs\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 
17:58:03.343158 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6t96\" (UniqueName: \"kubernetes.io/projected/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-kube-api-access-z6t96\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.343368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftd2l\" (UniqueName: \"kubernetes.io/projected/87e188e4-25c7-4068-8336-8396f99d9a51-kube-api-access-ftd2l\") pod \"frr-k8s-xfjst\" (UID: \"87e188e4-25c7-4068-8336-8396f99d9a51\") " pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424405 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-metrics-certs\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424475 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6sq7\" (UniqueName: \"kubernetes.io/projected/b6b73500-2314-4f41-83cf-3da8514065e3-kube-api-access-m6sq7\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424494 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-cert\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424512 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gz5xj\" (UniqueName: \"kubernetes.io/projected/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-kube-api-access-gz5xj\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424536 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-metrics-certs\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424555 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.424610 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b6b73500-2314-4f41-83cf-3da8514065e3-metallb-excludel2\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.424999 4892 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.425062 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-metrics-certs podName:b6b73500-2314-4f41-83cf-3da8514065e3 nodeName:}" failed. No retries permitted until 2026-02-17 17:58:03.925046905 +0000 UTC m=+855.300450170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-metrics-certs") pod "speaker-55rj9" (UID: "b6b73500-2314-4f41-83cf-3da8514065e3") : secret "speaker-certs-secret" not found Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.425482 4892 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.425514 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist podName:b6b73500-2314-4f41-83cf-3da8514065e3 nodeName:}" failed. No retries permitted until 2026-02-17 17:58:03.925505747 +0000 UTC m=+855.300909012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist") pod "speaker-55rj9" (UID: "b6b73500-2314-4f41-83cf-3da8514065e3") : secret "metallb-memberlist" not found Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.427160 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.428787 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-metrics-certs\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.428919 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b6b73500-2314-4f41-83cf-3da8514065e3-metallb-excludel2\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc 
kubenswrapper[4892]: I0217 17:58:03.439290 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-cert\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.441924 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5xj\" (UniqueName: \"kubernetes.io/projected/7a4756aa-28e7-41ff-a32c-65b2a2db0b71-kube-api-access-gz5xj\") pod \"controller-69bbfbf88f-n6bwg\" (UID: \"7a4756aa-28e7-41ff-a32c-65b2a2db0b71\") " pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.448757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6sq7\" (UniqueName: \"kubernetes.io/projected/b6b73500-2314-4f41-83cf-3da8514065e3-kube-api-access-m6sq7\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.525957 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xfjst" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.619490 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-n6bwg" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.834251 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.842394 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-t4nhs\" (UID: \"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.850831 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-n6bwg"] Feb 17 17:58:03 crc kubenswrapper[4892]: W0217 17:58:03.852329 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4756aa_28e7_41ff_a32c_65b2a2db0b71.slice/crio-a2056062ab2bbf8640640402ca3e049fec2ff5d1d3cbc3b68f7995bb73812652 WatchSource:0}: Error finding container a2056062ab2bbf8640640402ca3e049fec2ff5d1d3cbc3b68f7995bb73812652: Status 404 returned error can't find the container with id a2056062ab2bbf8640640402ca3e049fec2ff5d1d3cbc3b68f7995bb73812652 Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.935943 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9" Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.936141 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-metrics-certs\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9"
Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.936175 4892 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 17 17:58:03 crc kubenswrapper[4892]: E0217 17:58:03.936322 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist podName:b6b73500-2314-4f41-83cf-3da8514065e3 nodeName:}" failed. No retries permitted until 2026-02-17 17:58:04.936289807 +0000 UTC m=+856.311693132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist") pod "speaker-55rj9" (UID: "b6b73500-2314-4f41-83cf-3da8514065e3") : secret "metallb-memberlist" not found
Feb 17 17:58:03 crc kubenswrapper[4892]: I0217 17:58:03.940568 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-metrics-certs\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9"
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.131150 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs"
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.339654 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs"]
Feb 17 17:58:04 crc kubenswrapper[4892]: W0217 17:58:04.349889 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbdc8dbb_c968_46e3_8e11_9f1006a9ccbf.slice/crio-1d03c8ad7e61ade6a49c3351d5a353ea1a0d89295c2580c0d4f7d81483a1846d WatchSource:0}: Error finding container 1d03c8ad7e61ade6a49c3351d5a353ea1a0d89295c2580c0d4f7d81483a1846d: Status 404 returned error can't find the container with id 1d03c8ad7e61ade6a49c3351d5a353ea1a0d89295c2580c0d4f7d81483a1846d
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.463654 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"fe2a34cf67a8ac511e0a5bc273b64f0a4c854d110b3173a7e8aeb10e2ee52b83"}
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.464966 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" event={"ID":"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf","Type":"ContainerStarted","Data":"1d03c8ad7e61ade6a49c3351d5a353ea1a0d89295c2580c0d4f7d81483a1846d"}
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.467189 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-n6bwg" event={"ID":"7a4756aa-28e7-41ff-a32c-65b2a2db0b71","Type":"ContainerStarted","Data":"750b0cc08ab4d92118994829906a67f45d7b59bf24b79e0cbad54e43656542db"}
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.467219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-n6bwg" event={"ID":"7a4756aa-28e7-41ff-a32c-65b2a2db0b71","Type":"ContainerStarted","Data":"c2b9b2151c795000321aa14b36f56450787858ee86fe49ba04f9336f0b11f1c9"}
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.467234 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-n6bwg" event={"ID":"7a4756aa-28e7-41ff-a32c-65b2a2db0b71","Type":"ContainerStarted","Data":"a2056062ab2bbf8640640402ca3e049fec2ff5d1d3cbc3b68f7995bb73812652"}
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.468421 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-n6bwg"
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.497881 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-n6bwg" podStartSLOduration=1.497857765 podStartE2EDuration="1.497857765s" podCreationTimestamp="2026-02-17 17:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:58:04.492230273 +0000 UTC m=+855.867633548" watchObservedRunningTime="2026-02-17 17:58:04.497857765 +0000 UTC m=+855.873261050"
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.952607 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9"
Feb 17 17:58:04 crc kubenswrapper[4892]: I0217 17:58:04.961624 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b6b73500-2314-4f41-83cf-3da8514065e3-memberlist\") pod \"speaker-55rj9\" (UID: \"b6b73500-2314-4f41-83cf-3da8514065e3\") " pod="metallb-system/speaker-55rj9"
Feb 17 17:58:05 crc kubenswrapper[4892]: I0217 17:58:05.178756 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-55rj9"
Feb 17 17:58:05 crc kubenswrapper[4892]: I0217 17:58:05.480733 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-55rj9" event={"ID":"b6b73500-2314-4f41-83cf-3da8514065e3","Type":"ContainerStarted","Data":"048b55c7e34b5a564051d0654eec537b2b936e3d587a33fd2144d2859f8f43fd"}
Feb 17 17:58:06 crc kubenswrapper[4892]: I0217 17:58:06.492940 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-55rj9" event={"ID":"b6b73500-2314-4f41-83cf-3da8514065e3","Type":"ContainerStarted","Data":"bef1b0179db30a39242d3dae40af0f2910fb1bc7bfde46ae2f76bfb9b73bc29e"}
Feb 17 17:58:06 crc kubenswrapper[4892]: I0217 17:58:06.493151 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-55rj9" event={"ID":"b6b73500-2314-4f41-83cf-3da8514065e3","Type":"ContainerStarted","Data":"bf28ac4404e76258373a8d778597f8a72cefb2802e708cdd45ce44db251a2fe8"}
Feb 17 17:58:06 crc kubenswrapper[4892]: I0217 17:58:06.493169 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-55rj9"
Feb 17 17:58:09 crc kubenswrapper[4892]: I0217 17:58:09.382585 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-55rj9" podStartSLOduration=6.382565232 podStartE2EDuration="6.382565232s" podCreationTimestamp="2026-02-17 17:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:58:06.520408047 +0000 UTC m=+857.895811332" watchObservedRunningTime="2026-02-17 17:58:09.382565232 +0000 UTC m=+860.757968527"
Feb 17 17:58:10 crc kubenswrapper[4892]: I0217 17:58:10.540244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" event={"ID":"dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf","Type":"ContainerStarted","Data":"d6f84556d6cf77fa278ec3fc49aeeb9251ada3783de7749a82c887574631b5fc"}
Feb 17 17:58:10 crc kubenswrapper[4892]: I0217 17:58:10.540674 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs"
Feb 17 17:58:10 crc kubenswrapper[4892]: I0217 17:58:10.545263 4892 generic.go:334] "Generic (PLEG): container finished" podID="87e188e4-25c7-4068-8336-8396f99d9a51" containerID="73d296ec7b22ff9bec3c8ca1d000938447b40b8e624bf0b3f1c8993c681a623f" exitCode=0
Feb 17 17:58:10 crc kubenswrapper[4892]: I0217 17:58:10.545327 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerDied","Data":"73d296ec7b22ff9bec3c8ca1d000938447b40b8e624bf0b3f1c8993c681a623f"}
Feb 17 17:58:10 crc kubenswrapper[4892]: I0217 17:58:10.567936 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs" podStartSLOduration=1.685312488 podStartE2EDuration="7.56772786s" podCreationTimestamp="2026-02-17 17:58:03 +0000 UTC" firstStartedPulling="2026-02-17 17:58:04.352457845 +0000 UTC m=+855.727861110" lastFinishedPulling="2026-02-17 17:58:10.234873217 +0000 UTC m=+861.610276482" observedRunningTime="2026-02-17 17:58:10.558243255 +0000 UTC m=+861.933646560" watchObservedRunningTime="2026-02-17 17:58:10.56772786 +0000 UTC m=+861.943131155"
Feb 17 17:58:11 crc kubenswrapper[4892]: I0217 17:58:11.569801 4892 generic.go:334] "Generic (PLEG): container finished" podID="87e188e4-25c7-4068-8336-8396f99d9a51" containerID="b3dcd76b2cf1aec6d6531d542d96fd608604a892e994971e504bbc79594f5bf2" exitCode=0
Feb 17 17:58:11 crc kubenswrapper[4892]: I0217 17:58:11.569968 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerDied","Data":"b3dcd76b2cf1aec6d6531d542d96fd608604a892e994971e504bbc79594f5bf2"}
Feb 17 17:58:12 crc kubenswrapper[4892]: I0217 17:58:12.579389 4892 generic.go:334] "Generic (PLEG): container finished" podID="87e188e4-25c7-4068-8336-8396f99d9a51" containerID="5d7bb2595a7bc2eeaa6707fbfd0f1fdf68dae3b19c3108d612cbbd6e1b68016b" exitCode=0
Feb 17 17:58:12 crc kubenswrapper[4892]: I0217 17:58:12.579478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerDied","Data":"5d7bb2595a7bc2eeaa6707fbfd0f1fdf68dae3b19c3108d612cbbd6e1b68016b"}
Feb 17 17:58:13 crc kubenswrapper[4892]: I0217 17:58:13.588956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"2455e1a298b1fa830383910003174442d2ea5f1845c73f6c4656a4266a02249d"}
Feb 17 17:58:13 crc kubenswrapper[4892]: I0217 17:58:13.589540 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"68ff6bbaef555dd4672d0cf383542101266a2c01a4ef1813e9601dd291968ab6"}
Feb 17 17:58:13 crc kubenswrapper[4892]: I0217 17:58:13.589555 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"22c5ab74c40aa42742356b679963fb240da693e218036839c6645d3942fbacea"}
Feb 17 17:58:13 crc kubenswrapper[4892]: I0217 17:58:13.589567 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"a07ce2c488c2d3b1ba4c49b8b68197bde1c32f043fcd34fe895271d76f51b3bc"}
Feb 17 17:58:13 crc kubenswrapper[4892]: I0217 17:58:13.589577 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"56c0da5a51da5f99ec1a6a44b098ce4fd6f0446da30009eeed19e50071e2627d"}
Feb 17 17:58:13 crc kubenswrapper[4892]: I0217 17:58:13.623806 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-n6bwg"
Feb 17 17:58:14 crc kubenswrapper[4892]: I0217 17:58:14.607837 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xfjst" event={"ID":"87e188e4-25c7-4068-8336-8396f99d9a51","Type":"ContainerStarted","Data":"12628761476e2fbd5e7c95eb9483b58aa02999c2528da8b5eefd5944f526933f"}
Feb 17 17:58:14 crc kubenswrapper[4892]: I0217 17:58:14.608048 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xfjst"
Feb 17 17:58:14 crc kubenswrapper[4892]: I0217 17:58:14.638062 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xfjst" podStartSLOduration=5.112317929 podStartE2EDuration="11.638033653s" podCreationTimestamp="2026-02-17 17:58:03 +0000 UTC" firstStartedPulling="2026-02-17 17:58:03.693667506 +0000 UTC m=+855.069070771" lastFinishedPulling="2026-02-17 17:58:10.21938324 +0000 UTC m=+861.594786495" observedRunningTime="2026-02-17 17:58:14.633385598 +0000 UTC m=+866.008788873" watchObservedRunningTime="2026-02-17 17:58:14.638033653 +0000 UTC m=+866.013436928"
Feb 17 17:58:15 crc kubenswrapper[4892]: I0217 17:58:15.185466 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-55rj9"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.580300 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"]
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.583061 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.585282 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.594513 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"]
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.642262 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.642330 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxf82\" (UniqueName: \"kubernetes.io/projected/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-kube-api-access-jxf82\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.642353 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.743358 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.743695 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxf82\" (UniqueName: \"kubernetes.io/projected/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-kube-api-access-jxf82\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.743724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.743842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.744147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.770607 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxf82\" (UniqueName: \"kubernetes.io/projected/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-kube-api-access-jxf82\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:16 crc kubenswrapper[4892]: I0217 17:58:16.941277 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:17 crc kubenswrapper[4892]: I0217 17:58:17.237649 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"]
Feb 17 17:58:17 crc kubenswrapper[4892]: I0217 17:58:17.676292 4892 generic.go:334] "Generic (PLEG): container finished" podID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerID="adf629630ccaef5bcdae10201133f1941095fab15130e3a13f7a8be8e0bb7f88" exitCode=0
Feb 17 17:58:17 crc kubenswrapper[4892]: I0217 17:58:17.676346 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25" event={"ID":"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580","Type":"ContainerDied","Data":"adf629630ccaef5bcdae10201133f1941095fab15130e3a13f7a8be8e0bb7f88"}
Feb 17 17:58:17 crc kubenswrapper[4892]: I0217 17:58:17.676378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25" event={"ID":"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580","Type":"ContainerStarted","Data":"f7906edfa8c01023fb68dd00ce43f2d5f7a87b41b6ffdf2d3696d648deb505b9"}
Feb 17 17:58:18 crc kubenswrapper[4892]: I0217 17:58:18.527271 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xfjst"
Feb 17 17:58:18 crc kubenswrapper[4892]: I0217 17:58:18.584692 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xfjst"
Feb 17 17:58:22 crc kubenswrapper[4892]: I0217 17:58:22.732468 4892 generic.go:334] "Generic (PLEG): container finished" podID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerID="94e177198f519f6ab90777b4f62f26831be7f1b293b36a1f0cdb86b1f8058042" exitCode=0
Feb 17 17:58:22 crc kubenswrapper[4892]: I0217 17:58:22.732685 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25" event={"ID":"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580","Type":"ContainerDied","Data":"94e177198f519f6ab90777b4f62f26831be7f1b293b36a1f0cdb86b1f8058042"}
Feb 17 17:58:23 crc kubenswrapper[4892]: I0217 17:58:23.530984 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xfjst"
Feb 17 17:58:23 crc kubenswrapper[4892]: I0217 17:58:23.745482 4892 generic.go:334] "Generic (PLEG): container finished" podID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerID="f943cf8c88b52840e2215effbaf724c026e42b2e8a31c77a49ffd51e700df622" exitCode=0
Feb 17 17:58:23 crc kubenswrapper[4892]: I0217 17:58:23.745535 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25" event={"ID":"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580","Type":"ContainerDied","Data":"f943cf8c88b52840e2215effbaf724c026e42b2e8a31c77a49ffd51e700df622"}
Feb 17 17:58:24 crc kubenswrapper[4892]: I0217 17:58:24.136917 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-t4nhs"
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.050251 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.187719 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-util\") pod \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") "
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.187888 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-bundle\") pod \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") "
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.187954 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxf82\" (UniqueName: \"kubernetes.io/projected/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-kube-api-access-jxf82\") pod \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\" (UID: \"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580\") "
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.189161 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-bundle" (OuterVolumeSpecName: "bundle") pod "d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" (UID: "d5933e5f-3c83-41ab-b58a-9b5d5e7f7580"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.194009 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-kube-api-access-jxf82" (OuterVolumeSpecName: "kube-api-access-jxf82") pod "d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" (UID: "d5933e5f-3c83-41ab-b58a-9b5d5e7f7580"). InnerVolumeSpecName "kube-api-access-jxf82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.211112 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-util" (OuterVolumeSpecName: "util") pod "d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" (UID: "d5933e5f-3c83-41ab-b58a-9b5d5e7f7580"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.291167 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-util\") on node \"crc\" DevicePath \"\""
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.291197 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.291206 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxf82\" (UniqueName: \"kubernetes.io/projected/d5933e5f-3c83-41ab-b58a-9b5d5e7f7580-kube-api-access-jxf82\") on node \"crc\" DevicePath \"\""
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.764894 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25" event={"ID":"d5933e5f-3c83-41ab-b58a-9b5d5e7f7580","Type":"ContainerDied","Data":"f7906edfa8c01023fb68dd00ce43f2d5f7a87b41b6ffdf2d3696d648deb505b9"}
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.764973 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7906edfa8c01023fb68dd00ce43f2d5f7a87b41b6ffdf2d3696d648deb505b9"
Feb 17 17:58:25 crc kubenswrapper[4892]: I0217 17:58:25.765043 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.084841 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"]
Feb 17 17:58:30 crc kubenswrapper[4892]: E0217 17:58:30.085506 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="util"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.085519 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="util"
Feb 17 17:58:30 crc kubenswrapper[4892]: E0217 17:58:30.085557 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="pull"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.085564 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="pull"
Feb 17 17:58:30 crc kubenswrapper[4892]: E0217 17:58:30.085574 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="extract"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.085582 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="extract"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.085754 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5933e5f-3c83-41ab-b58a-9b5d5e7f7580" containerName="extract"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.086359 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.089155 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-92qsn"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.089182 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.089681 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.107062 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"]
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.172423 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/24975bf0-24b4-4cc3-9955-e2df208aca26-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6wdsm\" (UID: \"24975bf0-24b4-4cc3-9955-e2df208aca26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.172727 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsqwr\" (UniqueName: \"kubernetes.io/projected/24975bf0-24b4-4cc3-9955-e2df208aca26-kube-api-access-dsqwr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6wdsm\" (UID: \"24975bf0-24b4-4cc3-9955-e2df208aca26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.274066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/24975bf0-24b4-4cc3-9955-e2df208aca26-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6wdsm\" (UID: \"24975bf0-24b4-4cc3-9955-e2df208aca26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.274127 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsqwr\" (UniqueName: \"kubernetes.io/projected/24975bf0-24b4-4cc3-9955-e2df208aca26-kube-api-access-dsqwr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6wdsm\" (UID: \"24975bf0-24b4-4cc3-9955-e2df208aca26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.274535 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/24975bf0-24b4-4cc3-9955-e2df208aca26-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6wdsm\" (UID: \"24975bf0-24b4-4cc3-9955-e2df208aca26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.300770 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsqwr\" (UniqueName: \"kubernetes.io/projected/24975bf0-24b4-4cc3-9955-e2df208aca26-kube-api-access-dsqwr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6wdsm\" (UID: \"24975bf0-24b4-4cc3-9955-e2df208aca26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.461560 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"
Feb 17 17:58:30 crc kubenswrapper[4892]: I0217 17:58:30.966192 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm"]
Feb 17 17:58:30 crc kubenswrapper[4892]: W0217 17:58:30.973869 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24975bf0_24b4_4cc3_9955_e2df208aca26.slice/crio-99f0498fc92526fbe821ce5aa5bf68f0e30eb99a7d2b4630ef80ad796f770448 WatchSource:0}: Error finding container 99f0498fc92526fbe821ce5aa5bf68f0e30eb99a7d2b4630ef80ad796f770448: Status 404 returned error can't find the container with id 99f0498fc92526fbe821ce5aa5bf68f0e30eb99a7d2b4630ef80ad796f770448
Feb 17 17:58:31 crc kubenswrapper[4892]: I0217 17:58:31.807217 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm" event={"ID":"24975bf0-24b4-4cc3-9955-e2df208aca26","Type":"ContainerStarted","Data":"99f0498fc92526fbe821ce5aa5bf68f0e30eb99a7d2b4630ef80ad796f770448"}
Feb 17 17:58:35 crc kubenswrapper[4892]: I0217 17:58:35.757680 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8krld"]
Feb 17 17:58:35 crc kubenswrapper[4892]: I0217 17:58:35.760447 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:35 crc kubenswrapper[4892]: I0217 17:58:35.785074 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8krld"]
Feb 17 17:58:35 crc kubenswrapper[4892]: I0217 17:58:35.904964 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hnmv\" (UniqueName: \"kubernetes.io/projected/547f235f-cf83-4393-b38d-ec8cc7d7a491-kube-api-access-8hnmv\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:35 crc kubenswrapper[4892]: I0217 17:58:35.905035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-utilities\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:35 crc kubenswrapper[4892]: I0217 17:58:35.905209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-catalog-content\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.007084 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-catalog-content\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.007184 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hnmv\" (UniqueName: \"kubernetes.io/projected/547f235f-cf83-4393-b38d-ec8cc7d7a491-kube-api-access-8hnmv\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.007231 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-utilities\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.007623 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-catalog-content\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.007747 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-utilities\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.033086 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hnmv\" (UniqueName: \"kubernetes.io/projected/547f235f-cf83-4393-b38d-ec8cc7d7a491-kube-api-access-8hnmv\") pod \"community-operators-8krld\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.081839 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8krld"
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.623416 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8krld"]
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.851869 4892 generic.go:334] "Generic (PLEG): container finished" podID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerID="29fd21523b60634facb51f1651e0135c25756c7042d90f72aac5105b2827bfe9" exitCode=0
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.851908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerDied","Data":"29fd21523b60634facb51f1651e0135c25756c7042d90f72aac5105b2827bfe9"}
Feb 17 17:58:36 crc kubenswrapper[4892]: I0217 17:58:36.851933 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerStarted","Data":"24ae154f505e22d19b96f09e9304fcb9ae4ff9c11995bf1e81064215b40023c4"}
Feb 17 17:58:37 crc kubenswrapper[4892]: I0217 17:58:37.864966 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerStarted","Data":"81d3fec03d8b556c36c63d64b92c0f9f6280adb38914475942317333432f11e3"}
Feb 17 17:58:38 crc kubenswrapper[4892]: I0217 17:58:38.876966 4892 generic.go:334] "Generic (PLEG): container finished" podID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerID="81d3fec03d8b556c36c63d64b92c0f9f6280adb38914475942317333432f11e3" exitCode=0
Feb 17 17:58:38 crc kubenswrapper[4892]: I0217 17:58:38.877019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerDied","Data":"81d3fec03d8b556c36c63d64b92c0f9f6280adb38914475942317333432f11e3"}
Feb 17 17:58:39 crc kubenswrapper[4892]: I0217 17:58:39.899379 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerStarted","Data":"68d2f045e145141f5a24c2e216528c4adbbaf49071f246c94d8ef6c8ad6acd8f"}
Feb 17 17:58:40 crc kubenswrapper[4892]: I0217 17:58:40.939083 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8krld" podStartSLOduration=3.179924469 podStartE2EDuration="5.939064747s" podCreationTimestamp="2026-02-17 17:58:35 +0000 UTC" firstStartedPulling="2026-02-17 17:58:36.854564882 +0000 UTC m=+888.229968157" lastFinishedPulling="2026-02-17 17:58:39.61370513 +0000 UTC m=+890.989108435" observedRunningTime="2026-02-17 17:58:40.934393282 +0000 UTC m=+892.309796617" watchObservedRunningTime="2026-02-17 17:58:40.939064747 +0000 UTC m=+892.314468022"
Feb 17 17:58:44 crc kubenswrapper[4892]: I0217 17:58:44.950028 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm" event={"ID":"24975bf0-24b4-4cc3-9955-e2df208aca26","Type":"ContainerStarted","Data":"ca993f3fa693ec04743f1d655b594eeb0533ad04a31e33e291c771f3c2f5f0cf"}
Feb 17 17:58:45 crc kubenswrapper[4892]: I0217 17:58:45.008191 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6wdsm" podStartSLOduration=1.995707262 podStartE2EDuration="15.008164698s" podCreationTimestamp="2026-02-17 17:58:30 +0000 UTC" firstStartedPulling="2026-02-17 17:58:30.976784125 +0000 UTC m=+882.352187390" lastFinishedPulling="2026-02-17 17:58:43.989241561 +0000 UTC m=+895.364644826" observedRunningTime="2026-02-17
17:58:44.994662944 +0000 UTC m=+896.370066259" watchObservedRunningTime="2026-02-17 17:58:45.008164698 +0000 UTC m=+896.383567963" Feb 17 17:58:46 crc kubenswrapper[4892]: I0217 17:58:46.083112 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8krld" Feb 17 17:58:46 crc kubenswrapper[4892]: I0217 17:58:46.083395 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8krld" Feb 17 17:58:46 crc kubenswrapper[4892]: I0217 17:58:46.170250 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8krld" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.014595 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8krld" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.050205 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t2r56"] Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.051129 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.054480 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zgdkr" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.054594 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.055622 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.068615 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t2r56"] Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.119406 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8krld"] Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.217587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czqw9\" (UniqueName: \"kubernetes.io/projected/e7381d2c-f7e2-4935-be5f-380479a2e516-kube-api-access-czqw9\") pod \"cert-manager-webhook-6888856db4-t2r56\" (UID: \"e7381d2c-f7e2-4935-be5f-380479a2e516\") " pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.217672 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7381d2c-f7e2-4935-be5f-380479a2e516-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t2r56\" (UID: \"e7381d2c-f7e2-4935-be5f-380479a2e516\") " pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.318617 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czqw9\" 
(UniqueName: \"kubernetes.io/projected/e7381d2c-f7e2-4935-be5f-380479a2e516-kube-api-access-czqw9\") pod \"cert-manager-webhook-6888856db4-t2r56\" (UID: \"e7381d2c-f7e2-4935-be5f-380479a2e516\") " pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.318703 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7381d2c-f7e2-4935-be5f-380479a2e516-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t2r56\" (UID: \"e7381d2c-f7e2-4935-be5f-380479a2e516\") " pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.339452 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7381d2c-f7e2-4935-be5f-380479a2e516-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t2r56\" (UID: \"e7381d2c-f7e2-4935-be5f-380479a2e516\") " pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.340253 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czqw9\" (UniqueName: \"kubernetes.io/projected/e7381d2c-f7e2-4935-be5f-380479a2e516-kube-api-access-czqw9\") pod \"cert-manager-webhook-6888856db4-t2r56\" (UID: \"e7381d2c-f7e2-4935-be5f-380479a2e516\") " pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.372111 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.819290 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t2r56"] Feb 17 17:58:47 crc kubenswrapper[4892]: I0217 17:58:47.990260 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" event={"ID":"e7381d2c-f7e2-4935-be5f-380479a2e516","Type":"ContainerStarted","Data":"38331d8915a6d469c737386039803cde5150b5d10293a4950fa2cd9ccce085bc"} Feb 17 17:58:48 crc kubenswrapper[4892]: I0217 17:58:48.997941 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8krld" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="registry-server" containerID="cri-o://68d2f045e145141f5a24c2e216528c4adbbaf49071f246c94d8ef6c8ad6acd8f" gracePeriod=2 Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.016579 4892 generic.go:334] "Generic (PLEG): container finished" podID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerID="68d2f045e145141f5a24c2e216528c4adbbaf49071f246c94d8ef6c8ad6acd8f" exitCode=0 Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.016668 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerDied","Data":"68d2f045e145141f5a24c2e216528c4adbbaf49071f246c94d8ef6c8ad6acd8f"} Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.016911 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8krld" event={"ID":"547f235f-cf83-4393-b38d-ec8cc7d7a491","Type":"ContainerDied","Data":"24ae154f505e22d19b96f09e9304fcb9ae4ff9c11995bf1e81064215b40023c4"} Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.016963 4892 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="24ae154f505e22d19b96f09e9304fcb9ae4ff9c11995bf1e81064215b40023c4" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.038721 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8krld" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.082754 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hnmv\" (UniqueName: \"kubernetes.io/projected/547f235f-cf83-4393-b38d-ec8cc7d7a491-kube-api-access-8hnmv\") pod \"547f235f-cf83-4393-b38d-ec8cc7d7a491\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.082842 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-utilities\") pod \"547f235f-cf83-4393-b38d-ec8cc7d7a491\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.082971 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-catalog-content\") pod \"547f235f-cf83-4393-b38d-ec8cc7d7a491\" (UID: \"547f235f-cf83-4393-b38d-ec8cc7d7a491\") " Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.084006 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-utilities" (OuterVolumeSpecName: "utilities") pod "547f235f-cf83-4393-b38d-ec8cc7d7a491" (UID: "547f235f-cf83-4393-b38d-ec8cc7d7a491"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.091582 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/547f235f-cf83-4393-b38d-ec8cc7d7a491-kube-api-access-8hnmv" (OuterVolumeSpecName: "kube-api-access-8hnmv") pod "547f235f-cf83-4393-b38d-ec8cc7d7a491" (UID: "547f235f-cf83-4393-b38d-ec8cc7d7a491"). InnerVolumeSpecName "kube-api-access-8hnmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.144028 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "547f235f-cf83-4393-b38d-ec8cc7d7a491" (UID: "547f235f-cf83-4393-b38d-ec8cc7d7a491"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.187584 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.187645 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hnmv\" (UniqueName: \"kubernetes.io/projected/547f235f-cf83-4393-b38d-ec8cc7d7a491-kube-api-access-8hnmv\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.187658 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/547f235f-cf83-4393-b38d-ec8cc7d7a491-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.521369 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8x4md"] Feb 17 17:58:50 crc kubenswrapper[4892]: E0217 
17:58:50.521970 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="extract-utilities" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.521986 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="extract-utilities" Feb 17 17:58:50 crc kubenswrapper[4892]: E0217 17:58:50.522002 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="extract-content" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.522008 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="extract-content" Feb 17 17:58:50 crc kubenswrapper[4892]: E0217 17:58:50.522379 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="registry-server" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.522387 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="registry-server" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.522549 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" containerName="registry-server" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.533194 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.534400 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8x4md"] Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.536640 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-526kt" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.593321 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsclk\" (UniqueName: \"kubernetes.io/projected/4bf0f495-906b-4e3c-b5e2-eecaaf07335e-kube-api-access-gsclk\") pod \"cert-manager-cainjector-5545bd876-8x4md\" (UID: \"4bf0f495-906b-4e3c-b5e2-eecaaf07335e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.593366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bf0f495-906b-4e3c-b5e2-eecaaf07335e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8x4md\" (UID: \"4bf0f495-906b-4e3c-b5e2-eecaaf07335e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.695037 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsclk\" (UniqueName: \"kubernetes.io/projected/4bf0f495-906b-4e3c-b5e2-eecaaf07335e-kube-api-access-gsclk\") pod \"cert-manager-cainjector-5545bd876-8x4md\" (UID: \"4bf0f495-906b-4e3c-b5e2-eecaaf07335e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.695134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4bf0f495-906b-4e3c-b5e2-eecaaf07335e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8x4md\" (UID: \"4bf0f495-906b-4e3c-b5e2-eecaaf07335e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.711598 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bf0f495-906b-4e3c-b5e2-eecaaf07335e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8x4md\" (UID: \"4bf0f495-906b-4e3c-b5e2-eecaaf07335e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.715956 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsclk\" (UniqueName: \"kubernetes.io/projected/4bf0f495-906b-4e3c-b5e2-eecaaf07335e-kube-api-access-gsclk\") pod \"cert-manager-cainjector-5545bd876-8x4md\" (UID: \"4bf0f495-906b-4e3c-b5e2-eecaaf07335e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:50 crc kubenswrapper[4892]: I0217 17:58:50.855054 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" Feb 17 17:58:51 crc kubenswrapper[4892]: I0217 17:58:51.027354 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8krld" Feb 17 17:58:51 crc kubenswrapper[4892]: I0217 17:58:51.073890 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8krld"] Feb 17 17:58:51 crc kubenswrapper[4892]: I0217 17:58:51.081346 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8krld"] Feb 17 17:58:51 crc kubenswrapper[4892]: I0217 17:58:51.086707 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8x4md"] Feb 17 17:58:51 crc kubenswrapper[4892]: W0217 17:58:51.087550 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bf0f495_906b_4e3c_b5e2_eecaaf07335e.slice/crio-591b9765ef5b0eb07d5e4ae83faec6d0548b2932995c3c149f894f646f0faa14 WatchSource:0}: Error finding container 591b9765ef5b0eb07d5e4ae83faec6d0548b2932995c3c149f894f646f0faa14: Status 404 returned error can't find the container with id 591b9765ef5b0eb07d5e4ae83faec6d0548b2932995c3c149f894f646f0faa14 Feb 17 17:58:51 crc kubenswrapper[4892]: I0217 17:58:51.368465 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="547f235f-cf83-4393-b38d-ec8cc7d7a491" path="/var/lib/kubelet/pods/547f235f-cf83-4393-b38d-ec8cc7d7a491/volumes" Feb 17 17:58:52 crc kubenswrapper[4892]: I0217 17:58:52.038290 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" event={"ID":"4bf0f495-906b-4e3c-b5e2-eecaaf07335e","Type":"ContainerStarted","Data":"591b9765ef5b0eb07d5e4ae83faec6d0548b2932995c3c149f894f646f0faa14"} Feb 17 17:58:54 crc kubenswrapper[4892]: I0217 17:58:54.055067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" 
event={"ID":"e7381d2c-f7e2-4935-be5f-380479a2e516","Type":"ContainerStarted","Data":"10d169255788d2b75f1ab232458ffeebe47dd271fc7c9fb0a9efe3b1cb468c68"} Feb 17 17:58:54 crc kubenswrapper[4892]: I0217 17:58:54.055415 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:58:54 crc kubenswrapper[4892]: I0217 17:58:54.056743 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" event={"ID":"4bf0f495-906b-4e3c-b5e2-eecaaf07335e","Type":"ContainerStarted","Data":"e60ecb5f588f2539102861c9e39a49ca0b1abd558247440250fed9bd5a6d553a"} Feb 17 17:58:54 crc kubenswrapper[4892]: I0217 17:58:54.079360 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" podStartSLOduration=1.945472957 podStartE2EDuration="7.079338079s" podCreationTimestamp="2026-02-17 17:58:47 +0000 UTC" firstStartedPulling="2026-02-17 17:58:47.819911765 +0000 UTC m=+899.195315020" lastFinishedPulling="2026-02-17 17:58:52.953776877 +0000 UTC m=+904.329180142" observedRunningTime="2026-02-17 17:58:54.070564303 +0000 UTC m=+905.445967588" watchObservedRunningTime="2026-02-17 17:58:54.079338079 +0000 UTC m=+905.454741354" Feb 17 17:58:54 crc kubenswrapper[4892]: I0217 17:58:54.096028 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-8x4md" podStartSLOduration=2.23193831 podStartE2EDuration="4.096010899s" podCreationTimestamp="2026-02-17 17:58:50 +0000 UTC" firstStartedPulling="2026-02-17 17:58:51.08976697 +0000 UTC m=+902.465170235" lastFinishedPulling="2026-02-17 17:58:52.953839559 +0000 UTC m=+904.329242824" observedRunningTime="2026-02-17 17:58:54.09234235 +0000 UTC m=+905.467745635" watchObservedRunningTime="2026-02-17 17:58:54.096010899 +0000 UTC m=+905.471414164" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.842374 
4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zvmcw"] Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.847059 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.852338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-utilities\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.852445 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-catalog-content\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.852545 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zng9\" (UniqueName: \"kubernetes.io/projected/1b212f5a-29fd-4f56-b3ea-ce8570911957-kube-api-access-4zng9\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.904195 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvmcw"] Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.955535 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-catalog-content\") pod 
\"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.955613 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zng9\" (UniqueName: \"kubernetes.io/projected/1b212f5a-29fd-4f56-b3ea-ce8570911957-kube-api-access-4zng9\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.955677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-utilities\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.956190 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-utilities\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.956899 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-catalog-content\") pod \"redhat-marketplace-zvmcw\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:58:59 crc kubenswrapper[4892]: I0217 17:58:59.984646 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zng9\" (UniqueName: \"kubernetes.io/projected/1b212f5a-29fd-4f56-b3ea-ce8570911957-kube-api-access-4zng9\") pod \"redhat-marketplace-zvmcw\" (UID: 
\"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.225152 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.407849 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-62mrg"] Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.420325 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-62mrg"] Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.424561 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.428357 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-485kh" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.462899 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5wt\" (UniqueName: \"kubernetes.io/projected/c8cb92f6-00bd-44aa-bd4e-277d7327155d-kube-api-access-9v5wt\") pod \"cert-manager-545d4d4674-62mrg\" (UID: \"c8cb92f6-00bd-44aa-bd4e-277d7327155d\") " pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.462973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8cb92f6-00bd-44aa-bd4e-277d7327155d-bound-sa-token\") pod \"cert-manager-545d4d4674-62mrg\" (UID: \"c8cb92f6-00bd-44aa-bd4e-277d7327155d\") " pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.564068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/c8cb92f6-00bd-44aa-bd4e-277d7327155d-bound-sa-token\") pod \"cert-manager-545d4d4674-62mrg\" (UID: \"c8cb92f6-00bd-44aa-bd4e-277d7327155d\") " pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.564853 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5wt\" (UniqueName: \"kubernetes.io/projected/c8cb92f6-00bd-44aa-bd4e-277d7327155d-kube-api-access-9v5wt\") pod \"cert-manager-545d4d4674-62mrg\" (UID: \"c8cb92f6-00bd-44aa-bd4e-277d7327155d\") " pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.586399 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5wt\" (UniqueName: \"kubernetes.io/projected/c8cb92f6-00bd-44aa-bd4e-277d7327155d-kube-api-access-9v5wt\") pod \"cert-manager-545d4d4674-62mrg\" (UID: \"c8cb92f6-00bd-44aa-bd4e-277d7327155d\") " pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.587684 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8cb92f6-00bd-44aa-bd4e-277d7327155d-bound-sa-token\") pod \"cert-manager-545d4d4674-62mrg\" (UID: \"c8cb92f6-00bd-44aa-bd4e-277d7327155d\") " pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.671585 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvmcw"] Feb 17 17:59:00 crc kubenswrapper[4892]: W0217 17:59:00.676951 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b212f5a_29fd_4f56_b3ea_ce8570911957.slice/crio-a58aec50ba4810aaa79ac3e2b491e7370ff1765b808b62ed355db0876be3738f WatchSource:0}: Error finding container 
a58aec50ba4810aaa79ac3e2b491e7370ff1765b808b62ed355db0876be3738f: Status 404 returned error can't find the container with id a58aec50ba4810aaa79ac3e2b491e7370ff1765b808b62ed355db0876be3738f Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.743139 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-62mrg" Feb 17 17:59:00 crc kubenswrapper[4892]: I0217 17:59:00.960302 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-62mrg"] Feb 17 17:59:01 crc kubenswrapper[4892]: I0217 17:59:01.120378 4892 generic.go:334] "Generic (PLEG): container finished" podID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerID="a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c" exitCode=0 Feb 17 17:59:01 crc kubenswrapper[4892]: I0217 17:59:01.120780 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvmcw" event={"ID":"1b212f5a-29fd-4f56-b3ea-ce8570911957","Type":"ContainerDied","Data":"a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c"} Feb 17 17:59:01 crc kubenswrapper[4892]: I0217 17:59:01.123166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvmcw" event={"ID":"1b212f5a-29fd-4f56-b3ea-ce8570911957","Type":"ContainerStarted","Data":"a58aec50ba4810aaa79ac3e2b491e7370ff1765b808b62ed355db0876be3738f"} Feb 17 17:59:01 crc kubenswrapper[4892]: I0217 17:59:01.131359 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-62mrg" event={"ID":"c8cb92f6-00bd-44aa-bd4e-277d7327155d","Type":"ContainerStarted","Data":"f619e647e54111691382cb55018405c06886e4f0861c0baff231be12f783efbb"} Feb 17 17:59:02 crc kubenswrapper[4892]: I0217 17:59:02.140961 4892 generic.go:334] "Generic (PLEG): container finished" podID="1b212f5a-29fd-4f56-b3ea-ce8570911957" 
containerID="ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2" exitCode=0 Feb 17 17:59:02 crc kubenswrapper[4892]: I0217 17:59:02.141012 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvmcw" event={"ID":"1b212f5a-29fd-4f56-b3ea-ce8570911957","Type":"ContainerDied","Data":"ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2"} Feb 17 17:59:02 crc kubenswrapper[4892]: I0217 17:59:02.144174 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-62mrg" event={"ID":"c8cb92f6-00bd-44aa-bd4e-277d7327155d","Type":"ContainerStarted","Data":"8d0a08d7ed526de11abee7e2804a12472280db077018bae5f4a99f71d2dc9315"} Feb 17 17:59:02 crc kubenswrapper[4892]: I0217 17:59:02.181894 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-62mrg" podStartSLOduration=2.181869698 podStartE2EDuration="2.181869698s" podCreationTimestamp="2026-02-17 17:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:59:02.174804088 +0000 UTC m=+913.550207353" watchObservedRunningTime="2026-02-17 17:59:02.181869698 +0000 UTC m=+913.557272963" Feb 17 17:59:02 crc kubenswrapper[4892]: I0217 17:59:02.375601 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" Feb 17 17:59:03 crc kubenswrapper[4892]: I0217 17:59:03.152869 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvmcw" event={"ID":"1b212f5a-29fd-4f56-b3ea-ce8570911957","Type":"ContainerStarted","Data":"6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9"} Feb 17 17:59:03 crc kubenswrapper[4892]: I0217 17:59:03.170877 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-zvmcw" podStartSLOduration=2.775885873 podStartE2EDuration="4.170857168s" podCreationTimestamp="2026-02-17 17:58:59 +0000 UTC" firstStartedPulling="2026-02-17 17:59:01.122366187 +0000 UTC m=+912.497769452" lastFinishedPulling="2026-02-17 17:59:02.517337462 +0000 UTC m=+913.892740747" observedRunningTime="2026-02-17 17:59:03.170204351 +0000 UTC m=+914.545607616" watchObservedRunningTime="2026-02-17 17:59:03.170857168 +0000 UTC m=+914.546260443" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.016904 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wjwgw"] Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.020511 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.063206 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjwgw"] Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.066306 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-utilities\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.068503 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-catalog-content\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.068870 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-k8fsv\" (UniqueName: \"kubernetes.io/projected/6512465e-9615-4c53-8a1a-2c1d0e632bbc-kube-api-access-k8fsv\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.170283 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8fsv\" (UniqueName: \"kubernetes.io/projected/6512465e-9615-4c53-8a1a-2c1d0e632bbc-kube-api-access-k8fsv\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.170572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-utilities\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.170630 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-catalog-content\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.171082 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-catalog-content\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.171277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-utilities\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.222293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8fsv\" (UniqueName: \"kubernetes.io/projected/6512465e-9615-4c53-8a1a-2c1d0e632bbc-kube-api-access-k8fsv\") pod \"certified-operators-wjwgw\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.374345 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:05 crc kubenswrapper[4892]: I0217 17:59:05.868471 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjwgw"] Feb 17 17:59:05 crc kubenswrapper[4892]: W0217 17:59:05.871627 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6512465e_9615_4c53_8a1a_2c1d0e632bbc.slice/crio-51c40074cbd23261462669bf7ed5ba025fb706efee272694505e3750e3f9bf43 WatchSource:0}: Error finding container 51c40074cbd23261462669bf7ed5ba025fb706efee272694505e3750e3f9bf43: Status 404 returned error can't find the container with id 51c40074cbd23261462669bf7ed5ba025fb706efee272694505e3750e3f9bf43 Feb 17 17:59:06 crc kubenswrapper[4892]: I0217 17:59:06.173438 4892 generic.go:334] "Generic (PLEG): container finished" podID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerID="1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9" exitCode=0 Feb 17 17:59:06 crc kubenswrapper[4892]: I0217 17:59:06.173491 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" 
event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerDied","Data":"1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9"} Feb 17 17:59:06 crc kubenswrapper[4892]: I0217 17:59:06.173515 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerStarted","Data":"51c40074cbd23261462669bf7ed5ba025fb706efee272694505e3750e3f9bf43"} Feb 17 17:59:07 crc kubenswrapper[4892]: I0217 17:59:07.185419 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerStarted","Data":"fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4"} Feb 17 17:59:07 crc kubenswrapper[4892]: I0217 17:59:07.424752 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:59:07 crc kubenswrapper[4892]: I0217 17:59:07.424857 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:59:08 crc kubenswrapper[4892]: I0217 17:59:08.199730 4892 generic.go:334] "Generic (PLEG): container finished" podID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerID="fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4" exitCode=0 Feb 17 17:59:08 crc kubenswrapper[4892]: I0217 17:59:08.199886 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" 
event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerDied","Data":"fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4"} Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.014117 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k5wgc"] Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.015668 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.018101 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.018626 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.019116 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fzbc7" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.025570 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k5wgc"] Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.047346 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d965z\" (UniqueName: \"kubernetes.io/projected/72dd98da-284e-486e-8ec7-9418283ac655-kube-api-access-d965z\") pod \"openstack-operator-index-k5wgc\" (UID: \"72dd98da-284e-486e-8ec7-9418283ac655\") " pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.149672 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d965z\" (UniqueName: \"kubernetes.io/projected/72dd98da-284e-486e-8ec7-9418283ac655-kube-api-access-d965z\") pod \"openstack-operator-index-k5wgc\" (UID: 
\"72dd98da-284e-486e-8ec7-9418283ac655\") " pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.175019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d965z\" (UniqueName: \"kubernetes.io/projected/72dd98da-284e-486e-8ec7-9418283ac655-kube-api-access-d965z\") pod \"openstack-operator-index-k5wgc\" (UID: \"72dd98da-284e-486e-8ec7-9418283ac655\") " pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.219043 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerStarted","Data":"9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4"} Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.225576 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.225630 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.241593 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wjwgw" podStartSLOduration=3.281710065 podStartE2EDuration="6.241576074s" podCreationTimestamp="2026-02-17 17:59:04 +0000 UTC" firstStartedPulling="2026-02-17 17:59:06.174837127 +0000 UTC m=+917.550240392" lastFinishedPulling="2026-02-17 17:59:09.134703136 +0000 UTC m=+920.510106401" observedRunningTime="2026-02-17 17:59:10.240254638 +0000 UTC m=+921.615657923" watchObservedRunningTime="2026-02-17 17:59:10.241576074 +0000 UTC m=+921.616979349" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.288462 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.338853 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:10 crc kubenswrapper[4892]: I0217 17:59:10.663701 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k5wgc"] Feb 17 17:59:11 crc kubenswrapper[4892]: I0217 17:59:11.234699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k5wgc" event={"ID":"72dd98da-284e-486e-8ec7-9418283ac655","Type":"ContainerStarted","Data":"daece5f0e54e5298b39accf6786889eba568fc1e42115a44036ca50792b99abf"} Feb 17 17:59:11 crc kubenswrapper[4892]: I0217 17:59:11.296434 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:15 crc kubenswrapper[4892]: I0217 17:59:15.267512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k5wgc" event={"ID":"72dd98da-284e-486e-8ec7-9418283ac655","Type":"ContainerStarted","Data":"5dc2e1d7a45f6abccf715ed917ce0a23bd8d1edd02175875407cb72b19fb17f8"} Feb 17 17:59:15 crc kubenswrapper[4892]: I0217 17:59:15.298952 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k5wgc" podStartSLOduration=2.792912655 podStartE2EDuration="6.298921585s" podCreationTimestamp="2026-02-17 17:59:09 +0000 UTC" firstStartedPulling="2026-02-17 17:59:10.656845899 +0000 UTC m=+922.032249164" lastFinishedPulling="2026-02-17 17:59:14.162854789 +0000 UTC m=+925.538258094" observedRunningTime="2026-02-17 17:59:15.291625868 +0000 UTC m=+926.667029153" watchObservedRunningTime="2026-02-17 17:59:15.298921585 +0000 UTC m=+926.674324890" Feb 17 17:59:15 crc kubenswrapper[4892]: I0217 17:59:15.375042 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:15 crc kubenswrapper[4892]: I0217 17:59:15.375108 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:15 crc kubenswrapper[4892]: I0217 17:59:15.457594 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:16 crc kubenswrapper[4892]: I0217 17:59:16.339593 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:16 crc kubenswrapper[4892]: I0217 17:59:16.805503 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvmcw"] Feb 17 17:59:16 crc kubenswrapper[4892]: I0217 17:59:16.805847 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zvmcw" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="registry-server" containerID="cri-o://6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9" gracePeriod=2 Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.231066 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.289765 4892 generic.go:334] "Generic (PLEG): container finished" podID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerID="6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9" exitCode=0 Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.289840 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvmcw" event={"ID":"1b212f5a-29fd-4f56-b3ea-ce8570911957","Type":"ContainerDied","Data":"6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9"} Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.289872 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvmcw" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.289889 4892 scope.go:117] "RemoveContainer" containerID="6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.289878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvmcw" event={"ID":"1b212f5a-29fd-4f56-b3ea-ce8570911957","Type":"ContainerDied","Data":"a58aec50ba4810aaa79ac3e2b491e7370ff1765b808b62ed355db0876be3738f"} Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.306436 4892 scope.go:117] "RemoveContainer" containerID="ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.315397 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-catalog-content\") pod \"1b212f5a-29fd-4f56-b3ea-ce8570911957\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.315438 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-utilities\") pod \"1b212f5a-29fd-4f56-b3ea-ce8570911957\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.315479 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zng9\" (UniqueName: \"kubernetes.io/projected/1b212f5a-29fd-4f56-b3ea-ce8570911957-kube-api-access-4zng9\") pod \"1b212f5a-29fd-4f56-b3ea-ce8570911957\" (UID: \"1b212f5a-29fd-4f56-b3ea-ce8570911957\") " Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.316205 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-utilities" (OuterVolumeSpecName: "utilities") pod "1b212f5a-29fd-4f56-b3ea-ce8570911957" (UID: "1b212f5a-29fd-4f56-b3ea-ce8570911957"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.334050 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b212f5a-29fd-4f56-b3ea-ce8570911957-kube-api-access-4zng9" (OuterVolumeSpecName: "kube-api-access-4zng9") pod "1b212f5a-29fd-4f56-b3ea-ce8570911957" (UID: "1b212f5a-29fd-4f56-b3ea-ce8570911957"). InnerVolumeSpecName "kube-api-access-4zng9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.336583 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b212f5a-29fd-4f56-b3ea-ce8570911957" (UID: "1b212f5a-29fd-4f56-b3ea-ce8570911957"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.343451 4892 scope.go:117] "RemoveContainer" containerID="a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.384236 4892 scope.go:117] "RemoveContainer" containerID="6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9" Feb 17 17:59:17 crc kubenswrapper[4892]: E0217 17:59:17.393627 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9\": container with ID starting with 6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9 not found: ID does not exist" containerID="6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.393673 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9"} err="failed to get container status \"6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9\": rpc error: code = NotFound desc = could not find container \"6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9\": container with ID starting with 6c0a8d33d54ff2a915d217ab8afbc7c857dea93ca601df061104c5438e4db3e9 not found: ID does not exist" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.393699 4892 scope.go:117] "RemoveContainer" containerID="ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2" Feb 17 17:59:17 crc kubenswrapper[4892]: E0217 17:59:17.398946 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2\": container with ID starting with 
ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2 not found: ID does not exist" containerID="ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.398989 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2"} err="failed to get container status \"ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2\": rpc error: code = NotFound desc = could not find container \"ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2\": container with ID starting with ab7409bd2f57f0c5cec652eec45eb7584d1ed37fa15c984cab1297471c6dc3c2 not found: ID does not exist" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.399017 4892 scope.go:117] "RemoveContainer" containerID="a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c" Feb 17 17:59:17 crc kubenswrapper[4892]: E0217 17:59:17.399582 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c\": container with ID starting with a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c not found: ID does not exist" containerID="a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.399616 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c"} err="failed to get container status \"a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c\": rpc error: code = NotFound desc = could not find container \"a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c\": container with ID starting with a03327377525d02b57c55559888469af4e64b119e55db441bb78e4649874994c not found: ID does not 
exist" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.416993 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.417033 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b212f5a-29fd-4f56-b3ea-ce8570911957-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.417044 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zng9\" (UniqueName: \"kubernetes.io/projected/1b212f5a-29fd-4f56-b3ea-ce8570911957-kube-api-access-4zng9\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.616713 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvmcw"] Feb 17 17:59:17 crc kubenswrapper[4892]: I0217 17:59:17.627052 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvmcw"] Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.380091 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" path="/var/lib/kubelet/pods/1b212f5a-29fd-4f56-b3ea-ce8570911957/volumes" Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.417898 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjwgw"] Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.418188 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wjwgw" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="registry-server" containerID="cri-o://9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4" gracePeriod=2 Feb 17 17:59:19 crc kubenswrapper[4892]: 
I0217 17:59:19.906387 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.954357 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-catalog-content\") pod \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.954409 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-utilities\") pod \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.954518 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8fsv\" (UniqueName: \"kubernetes.io/projected/6512465e-9615-4c53-8a1a-2c1d0e632bbc-kube-api-access-k8fsv\") pod \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\" (UID: \"6512465e-9615-4c53-8a1a-2c1d0e632bbc\") " Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.955176 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-utilities" (OuterVolumeSpecName: "utilities") pod "6512465e-9615-4c53-8a1a-2c1d0e632bbc" (UID: "6512465e-9615-4c53-8a1a-2c1d0e632bbc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:19 crc kubenswrapper[4892]: I0217 17:59:19.959703 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6512465e-9615-4c53-8a1a-2c1d0e632bbc-kube-api-access-k8fsv" (OuterVolumeSpecName: "kube-api-access-k8fsv") pod "6512465e-9615-4c53-8a1a-2c1d0e632bbc" (UID: "6512465e-9615-4c53-8a1a-2c1d0e632bbc"). InnerVolumeSpecName "kube-api-access-k8fsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.009345 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6512465e-9615-4c53-8a1a-2c1d0e632bbc" (UID: "6512465e-9615-4c53-8a1a-2c1d0e632bbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.056134 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.056166 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6512465e-9615-4c53-8a1a-2c1d0e632bbc-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.056175 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8fsv\" (UniqueName: \"kubernetes.io/projected/6512465e-9615-4c53-8a1a-2c1d0e632bbc-kube-api-access-k8fsv\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.318282 4892 generic.go:334] "Generic (PLEG): container finished" podID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" 
containerID="9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4" exitCode=0 Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.318330 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerDied","Data":"9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4"} Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.318356 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjwgw" event={"ID":"6512465e-9615-4c53-8a1a-2c1d0e632bbc","Type":"ContainerDied","Data":"51c40074cbd23261462669bf7ed5ba025fb706efee272694505e3750e3f9bf43"} Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.318373 4892 scope.go:117] "RemoveContainer" containerID="9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.318372 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjwgw" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.337556 4892 scope.go:117] "RemoveContainer" containerID="fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.339051 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.339096 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.347587 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjwgw"] Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.353365 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wjwgw"] Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.365163 4892 scope.go:117] "RemoveContainer" containerID="1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.384438 4892 scope.go:117] "RemoveContainer" containerID="9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4" Feb 17 17:59:20 crc kubenswrapper[4892]: E0217 17:59:20.385238 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4\": container with ID starting with 9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4 not found: ID does not exist" containerID="9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.385273 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4"} err="failed to get container status \"9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4\": rpc error: code = NotFound desc = could not find container \"9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4\": container with ID starting with 9a1646bd9a374f0fcf1ad5b2ebacf20059a9359d2d45815e1bd296e43b91d8d4 not found: ID does not exist" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.385320 4892 scope.go:117] "RemoveContainer" containerID="fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.385576 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:20 crc kubenswrapper[4892]: E0217 17:59:20.386052 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4\": container with ID starting with fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4 not found: ID does not exist" containerID="fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.386086 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4"} err="failed to get container status \"fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4\": rpc error: code = NotFound desc = could not find container \"fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4\": container with ID starting with fe351e11a21beca5476ee9b199210328ddecf3e539e74a470fcb74004498bed4 not found: ID does not exist" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.386099 4892 scope.go:117] "RemoveContainer" 
containerID="1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9" Feb 17 17:59:20 crc kubenswrapper[4892]: E0217 17:59:20.386515 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9\": container with ID starting with 1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9 not found: ID does not exist" containerID="1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9" Feb 17 17:59:20 crc kubenswrapper[4892]: I0217 17:59:20.386566 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9"} err="failed to get container status \"1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9\": rpc error: code = NotFound desc = could not find container \"1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9\": container with ID starting with 1452062ed15176af2fdb80eb058be165ffd9f76e18436b70ff8279fbe0a3ada9 not found: ID does not exist" Feb 17 17:59:21 crc kubenswrapper[4892]: I0217 17:59:21.383156 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" path="/var/lib/kubelet/pods/6512465e-9615-4c53-8a1a-2c1d0e632bbc/volumes" Feb 17 17:59:21 crc kubenswrapper[4892]: I0217 17:59:21.384489 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-k5wgc" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.061229 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn"] Feb 17 17:59:25 crc kubenswrapper[4892]: E0217 17:59:25.062130 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="extract-utilities" Feb 
17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062147 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="extract-utilities" Feb 17 17:59:25 crc kubenswrapper[4892]: E0217 17:59:25.062163 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="extract-utilities" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062171 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="extract-utilities" Feb 17 17:59:25 crc kubenswrapper[4892]: E0217 17:59:25.062188 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="extract-content" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062196 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="extract-content" Feb 17 17:59:25 crc kubenswrapper[4892]: E0217 17:59:25.062219 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="extract-content" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062227 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="extract-content" Feb 17 17:59:25 crc kubenswrapper[4892]: E0217 17:59:25.062243 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="registry-server" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062250 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="registry-server" Feb 17 17:59:25 crc kubenswrapper[4892]: E0217 17:59:25.062270 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="registry-server" Feb 17 
17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062277 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="registry-server" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062450 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b212f5a-29fd-4f56-b3ea-ce8570911957" containerName="registry-server" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.062466 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6512465e-9615-4c53-8a1a-2c1d0e632bbc" containerName="registry-server" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.063773 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.065878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vfk5f" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.086170 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn"] Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.139923 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhlt6\" (UniqueName: \"kubernetes.io/projected/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-kube-api-access-nhlt6\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.140007 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-util\") pod 
\"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.140036 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-bundle\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.241465 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhlt6\" (UniqueName: \"kubernetes.io/projected/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-kube-api-access-nhlt6\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.241532 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-bundle\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.241553 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-util\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " 
pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.242119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-util\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.242187 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-bundle\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.259591 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhlt6\" (UniqueName: \"kubernetes.io/projected/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-kube-api-access-nhlt6\") pod \"6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.387424 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:25 crc kubenswrapper[4892]: I0217 17:59:25.729860 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn"] Feb 17 17:59:25 crc kubenswrapper[4892]: W0217 17:59:25.737921 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ebf28d4_a728_4295_a2c1_bbd21cd9c333.slice/crio-d4959cb560c4a3dc42ed41d41f287b7ef46e0d23c296faa5a2a7a70970613670 WatchSource:0}: Error finding container d4959cb560c4a3dc42ed41d41f287b7ef46e0d23c296faa5a2a7a70970613670: Status 404 returned error can't find the container with id d4959cb560c4a3dc42ed41d41f287b7ef46e0d23c296faa5a2a7a70970613670 Feb 17 17:59:26 crc kubenswrapper[4892]: I0217 17:59:26.375542 4892 generic.go:334] "Generic (PLEG): container finished" podID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerID="c11b284332cd0527b0c2b37907a5ac10ff8d5d290692174ed29071cfe9f488e0" exitCode=0 Feb 17 17:59:26 crc kubenswrapper[4892]: I0217 17:59:26.375625 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" event={"ID":"4ebf28d4-a728-4295-a2c1-bbd21cd9c333","Type":"ContainerDied","Data":"c11b284332cd0527b0c2b37907a5ac10ff8d5d290692174ed29071cfe9f488e0"} Feb 17 17:59:26 crc kubenswrapper[4892]: I0217 17:59:26.375850 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" event={"ID":"4ebf28d4-a728-4295-a2c1-bbd21cd9c333","Type":"ContainerStarted","Data":"d4959cb560c4a3dc42ed41d41f287b7ef46e0d23c296faa5a2a7a70970613670"} Feb 17 17:59:27 crc kubenswrapper[4892]: I0217 17:59:27.386196 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerID="097662a27eb2c98811b1a4df3c49a32b276f746e2d87e4a3c41f6b14b13e9bdc" exitCode=0 Feb 17 17:59:27 crc kubenswrapper[4892]: I0217 17:59:27.386293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" event={"ID":"4ebf28d4-a728-4295-a2c1-bbd21cd9c333","Type":"ContainerDied","Data":"097662a27eb2c98811b1a4df3c49a32b276f746e2d87e4a3c41f6b14b13e9bdc"} Feb 17 17:59:28 crc kubenswrapper[4892]: I0217 17:59:28.396735 4892 generic.go:334] "Generic (PLEG): container finished" podID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerID="5148ac027556151e5f3759f44cf152c793e6e5e8a34409345cc04f907d4708e1" exitCode=0 Feb 17 17:59:28 crc kubenswrapper[4892]: I0217 17:59:28.396799 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" event={"ID":"4ebf28d4-a728-4295-a2c1-bbd21cd9c333","Type":"ContainerDied","Data":"5148ac027556151e5f3759f44cf152c793e6e5e8a34409345cc04f907d4708e1"} Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.732983 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.829057 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-util\") pod \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.829167 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhlt6\" (UniqueName: \"kubernetes.io/projected/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-kube-api-access-nhlt6\") pod \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.829191 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-bundle\") pod \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\" (UID: \"4ebf28d4-a728-4295-a2c1-bbd21cd9c333\") " Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.829830 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-bundle" (OuterVolumeSpecName: "bundle") pod "4ebf28d4-a728-4295-a2c1-bbd21cd9c333" (UID: "4ebf28d4-a728-4295-a2c1-bbd21cd9c333"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.830124 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.834044 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-kube-api-access-nhlt6" (OuterVolumeSpecName: "kube-api-access-nhlt6") pod "4ebf28d4-a728-4295-a2c1-bbd21cd9c333" (UID: "4ebf28d4-a728-4295-a2c1-bbd21cd9c333"). InnerVolumeSpecName "kube-api-access-nhlt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.843427 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-util" (OuterVolumeSpecName: "util") pod "4ebf28d4-a728-4295-a2c1-bbd21cd9c333" (UID: "4ebf28d4-a728-4295-a2c1-bbd21cd9c333"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.931198 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-util\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:29 crc kubenswrapper[4892]: I0217 17:59:29.931234 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhlt6\" (UniqueName: \"kubernetes.io/projected/4ebf28d4-a728-4295-a2c1-bbd21cd9c333-kube-api-access-nhlt6\") on node \"crc\" DevicePath \"\"" Feb 17 17:59:30 crc kubenswrapper[4892]: I0217 17:59:30.416696 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" event={"ID":"4ebf28d4-a728-4295-a2c1-bbd21cd9c333","Type":"ContainerDied","Data":"d4959cb560c4a3dc42ed41d41f287b7ef46e0d23c296faa5a2a7a70970613670"} Feb 17 17:59:30 crc kubenswrapper[4892]: I0217 17:59:30.416740 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4959cb560c4a3dc42ed41d41f287b7ef46e0d23c296faa5a2a7a70970613670" Feb 17 17:59:30 crc kubenswrapper[4892]: I0217 17:59:30.416876 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.309454 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv"] Feb 17 17:59:35 crc kubenswrapper[4892]: E0217 17:59:35.310381 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="extract" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.310398 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="extract" Feb 17 17:59:35 crc kubenswrapper[4892]: E0217 17:59:35.310414 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="pull" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.310422 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="pull" Feb 17 17:59:35 crc kubenswrapper[4892]: E0217 17:59:35.310443 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="util" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.310453 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="util" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.310632 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ebf28d4-a728-4295-a2c1-bbd21cd9c333" containerName="extract" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.311235 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.315744 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-8vj5h" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.344347 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv"] Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.423312 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z926b\" (UniqueName: \"kubernetes.io/projected/3916f167-fdb1-4c54-bc25-0a7fa20a6794-kube-api-access-z926b\") pod \"openstack-operator-controller-init-5498fb9db9-4w9jv\" (UID: \"3916f167-fdb1-4c54-bc25-0a7fa20a6794\") " pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.524993 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z926b\" (UniqueName: \"kubernetes.io/projected/3916f167-fdb1-4c54-bc25-0a7fa20a6794-kube-api-access-z926b\") pod \"openstack-operator-controller-init-5498fb9db9-4w9jv\" (UID: \"3916f167-fdb1-4c54-bc25-0a7fa20a6794\") " pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.543877 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z926b\" (UniqueName: \"kubernetes.io/projected/3916f167-fdb1-4c54-bc25-0a7fa20a6794-kube-api-access-z926b\") pod \"openstack-operator-controller-init-5498fb9db9-4w9jv\" (UID: \"3916f167-fdb1-4c54-bc25-0a7fa20a6794\") " pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 17:59:35 crc kubenswrapper[4892]: I0217 17:59:35.631028 4892 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 17:59:36 crc kubenswrapper[4892]: I0217 17:59:36.146291 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv"] Feb 17 17:59:36 crc kubenswrapper[4892]: I0217 17:59:36.465582 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" event={"ID":"3916f167-fdb1-4c54-bc25-0a7fa20a6794","Type":"ContainerStarted","Data":"4b610c417be00d3472e2f9d82306f03e5e51a86b40b12e82ae8afaa3188cd9b1"} Feb 17 17:59:37 crc kubenswrapper[4892]: I0217 17:59:37.424325 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:59:37 crc kubenswrapper[4892]: I0217 17:59:37.425239 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:59:41 crc kubenswrapper[4892]: I0217 17:59:41.506001 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" event={"ID":"3916f167-fdb1-4c54-bc25-0a7fa20a6794","Type":"ContainerStarted","Data":"a0cf1b7d6cc99508b30fc68e24e2393159240c83dcded4add20c5b613af92337"} Feb 17 17:59:41 crc kubenswrapper[4892]: I0217 17:59:41.506670 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 17:59:41 crc 
kubenswrapper[4892]: I0217 17:59:41.538001 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" podStartSLOduration=2.29157073 podStartE2EDuration="6.53798188s" podCreationTimestamp="2026-02-17 17:59:35 +0000 UTC" firstStartedPulling="2026-02-17 17:59:36.130502081 +0000 UTC m=+947.505905346" lastFinishedPulling="2026-02-17 17:59:40.376913231 +0000 UTC m=+951.752316496" observedRunningTime="2026-02-17 17:59:41.533350356 +0000 UTC m=+952.908753621" watchObservedRunningTime="2026-02-17 17:59:41.53798188 +0000 UTC m=+952.913385145" Feb 17 17:59:45 crc kubenswrapper[4892]: I0217 17:59:45.633490 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5498fb9db9-4w9jv" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.146064 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52"] Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.148279 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.150615 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.152021 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52"] Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.154551 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.225570 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758p6\" (UniqueName: \"kubernetes.io/projected/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-kube-api-access-758p6\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.225645 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-config-volume\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.225786 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-secret-volume\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.327788 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-758p6\" (UniqueName: \"kubernetes.io/projected/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-kube-api-access-758p6\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.327907 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-config-volume\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.327988 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-secret-volume\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.328895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-config-volume\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.341348 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-secret-volume\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.361339 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-758p6\" (UniqueName: \"kubernetes.io/projected/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-kube-api-access-758p6\") pod \"collect-profiles-29522520-5jt52\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.469753 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:00 crc kubenswrapper[4892]: I0217 18:00:00.995108 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52"] Feb 17 18:00:01 crc kubenswrapper[4892]: I0217 18:00:01.689045 4892 generic.go:334] "Generic (PLEG): container finished" podID="8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" containerID="f7283cef0e509d28cda8cd3f7e1d8875cd21279f2449d37c8cdedc0c5d655358" exitCode=0 Feb 17 18:00:01 crc kubenswrapper[4892]: I0217 18:00:01.689323 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" event={"ID":"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21","Type":"ContainerDied","Data":"f7283cef0e509d28cda8cd3f7e1d8875cd21279f2449d37c8cdedc0c5d655358"} Feb 17 18:00:01 crc kubenswrapper[4892]: I0217 18:00:01.689350 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" 
event={"ID":"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21","Type":"ContainerStarted","Data":"886ccdc857ade481f382c7a83f1f44d6723bc24dab96e6809d94f59301e39a3a"} Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:02.999760 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.075527 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-config-volume\") pod \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.075631 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-secret-volume\") pod \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.075668 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-758p6\" (UniqueName: \"kubernetes.io/projected/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-kube-api-access-758p6\") pod \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\" (UID: \"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21\") " Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.081900 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" (UID: "8a8477f7-43ee-4fe8-8ff6-1af065f5ab21"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.087994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-kube-api-access-758p6" (OuterVolumeSpecName: "kube-api-access-758p6") pod "8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" (UID: "8a8477f7-43ee-4fe8-8ff6-1af065f5ab21"). InnerVolumeSpecName "kube-api-access-758p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.089363 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" (UID: "8a8477f7-43ee-4fe8-8ff6-1af065f5ab21"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.177570 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.177599 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.177609 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-758p6\" (UniqueName: \"kubernetes.io/projected/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21-kube-api-access-758p6\") on node \"crc\" DevicePath \"\"" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.461701 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw"] Feb 17 18:00:03 crc kubenswrapper[4892]: 
E0217 18:00:03.462354 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" containerName="collect-profiles" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.462374 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" containerName="collect-profiles" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.462593 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" containerName="collect-profiles" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.463185 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.465092 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-pvxjp" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.469203 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.479286 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.480210 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.481676 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqw8f\" (UniqueName: \"kubernetes.io/projected/ed7add95-6153-4284-b9be-a76a4142a35e-kube-api-access-nqw8f\") pod \"barbican-operator-controller-manager-868647ff47-p6wgw\" (UID: \"ed7add95-6153-4284-b9be-a76a4142a35e\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.483044 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w7gbt" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.492446 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.493406 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.496323 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hkcsk" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.511559 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.525891 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.532442 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.538101 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.540763 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p6krq" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.547939 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.557804 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.558746 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.562167 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-fq7gc" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.582952 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45cr\" (UniqueName: \"kubernetes.io/projected/cc4b7060-d89f-47c5-b2e4-2a793606350c-kube-api-access-n45cr\") pod \"designate-operator-controller-manager-6d8bf5c495-5pdhr\" (UID: \"cc4b7060-d89f-47c5-b2e4-2a793606350c\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.582987 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv8nx\" (UniqueName: \"kubernetes.io/projected/8dcfa260-ea93-42f8-a345-aec700f9e938-kube-api-access-rv8nx\") pod \"glance-operator-controller-manager-77987464f4-ml9zk\" (UID: \"8dcfa260-ea93-42f8-a345-aec700f9e938\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.583050 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqw8f\" (UniqueName: \"kubernetes.io/projected/ed7add95-6153-4284-b9be-a76a4142a35e-kube-api-access-nqw8f\") pod \"barbican-operator-controller-manager-868647ff47-p6wgw\" (UID: \"ed7add95-6153-4284-b9be-a76a4142a35e\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.583099 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dpx\" (UniqueName: 
\"kubernetes.io/projected/727885b3-46ff-43a8-9991-28567f43d07e-kube-api-access-p6dpx\") pod \"cinder-operator-controller-manager-5d946d989d-254f9\" (UID: \"727885b3-46ff-43a8-9991-28567f43d07e\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.583130 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mml\" (UniqueName: \"kubernetes.io/projected/67ccc01c-23ce-407b-91dd-9554c49acbd5-kube-api-access-28mml\") pod \"heat-operator-controller-manager-69f49c598c-ngzhd\" (UID: \"67ccc01c-23ce-407b-91dd-9554c49acbd5\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.602304 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.633655 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqw8f\" (UniqueName: \"kubernetes.io/projected/ed7add95-6153-4284-b9be-a76a4142a35e-kube-api-access-nqw8f\") pod \"barbican-operator-controller-manager-868647ff47-p6wgw\" (UID: \"ed7add95-6153-4284-b9be-a76a4142a35e\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.654901 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.655880 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.663628 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-hc496" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.677969 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.685312 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dpx\" (UniqueName: \"kubernetes.io/projected/727885b3-46ff-43a8-9991-28567f43d07e-kube-api-access-p6dpx\") pod \"cinder-operator-controller-manager-5d946d989d-254f9\" (UID: \"727885b3-46ff-43a8-9991-28567f43d07e\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.685365 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mml\" (UniqueName: \"kubernetes.io/projected/67ccc01c-23ce-407b-91dd-9554c49acbd5-kube-api-access-28mml\") pod \"heat-operator-controller-manager-69f49c598c-ngzhd\" (UID: \"67ccc01c-23ce-407b-91dd-9554c49acbd5\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.685432 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45cr\" (UniqueName: \"kubernetes.io/projected/cc4b7060-d89f-47c5-b2e4-2a793606350c-kube-api-access-n45cr\") pod \"designate-operator-controller-manager-6d8bf5c495-5pdhr\" (UID: \"cc4b7060-d89f-47c5-b2e4-2a793606350c\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.685453 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rv8nx\" (UniqueName: \"kubernetes.io/projected/8dcfa260-ea93-42f8-a345-aec700f9e938-kube-api-access-rv8nx\") pod \"glance-operator-controller-manager-77987464f4-ml9zk\" (UID: \"8dcfa260-ea93-42f8-a345-aec700f9e938\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.685487 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9697f\" (UniqueName: \"kubernetes.io/projected/48f194fc-64b3-4ef2-9006-8b533ce72000-kube-api-access-9697f\") pod \"horizon-operator-controller-manager-5b9b8895d5-nr7vj\" (UID: \"48f194fc-64b3-4ef2-9006-8b533ce72000\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.686873 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.687743 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.692256 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.703973 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-njvgq" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.711025 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mml\" (UniqueName: \"kubernetes.io/projected/67ccc01c-23ce-407b-91dd-9554c49acbd5-kube-api-access-28mml\") pod \"heat-operator-controller-manager-69f49c598c-ngzhd\" (UID: \"67ccc01c-23ce-407b-91dd-9554c49acbd5\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.713074 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv8nx\" (UniqueName: \"kubernetes.io/projected/8dcfa260-ea93-42f8-a345-aec700f9e938-kube-api-access-rv8nx\") pod \"glance-operator-controller-manager-77987464f4-ml9zk\" (UID: \"8dcfa260-ea93-42f8-a345-aec700f9e938\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.714866 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.715719 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45cr\" (UniqueName: \"kubernetes.io/projected/cc4b7060-d89f-47c5-b2e4-2a793606350c-kube-api-access-n45cr\") pod \"designate-operator-controller-manager-6d8bf5c495-5pdhr\" (UID: \"cc4b7060-d89f-47c5-b2e4-2a793606350c\") " 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.718224 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dpx\" (UniqueName: \"kubernetes.io/projected/727885b3-46ff-43a8-9991-28567f43d07e-kube-api-access-p6dpx\") pod \"cinder-operator-controller-manager-5d946d989d-254f9\" (UID: \"727885b3-46ff-43a8-9991-28567f43d07e\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.718512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" event={"ID":"8a8477f7-43ee-4fe8-8ff6-1af065f5ab21","Type":"ContainerDied","Data":"886ccdc857ade481f382c7a83f1f44d6723bc24dab96e6809d94f59301e39a3a"} Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.718541 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="886ccdc857ade481f382c7a83f1f44d6723bc24dab96e6809d94f59301e39a3a" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.718630 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.720881 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.721815 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.725893 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-564wn" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.728452 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.740491 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.741519 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.746402 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rlhq2" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.762422 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.770270 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.771304 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.773902 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.773929 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ccbzj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.774872 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.777363 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kpt6r" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.779496 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.780748 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.781722 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.786236 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jnsrp" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.786939 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhnt5\" (UniqueName: \"kubernetes.io/projected/9693b58d-64b4-4d18-a746-ec0a67606de5-kube-api-access-qhnt5\") pod \"ironic-operator-controller-manager-554564d7fc-wdsjq\" (UID: \"9693b58d-64b4-4d18-a746-ec0a67606de5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.786972 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862ff\" (UniqueName: \"kubernetes.io/projected/cdcfbb9d-667b-4333-bb72-96bbf99ed979-kube-api-access-862ff\") pod \"keystone-operator-controller-manager-b4d948c87-tw7fv\" (UID: \"cdcfbb9d-667b-4333-bb72-96bbf99ed979\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.786996 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfj4\" (UniqueName: \"kubernetes.io/projected/5716f235-cb99-4c41-b126-c122c572684a-kube-api-access-nnfj4\") pod \"mariadb-operator-controller-manager-6994f66f48-67zr2\" (UID: \"5716f235-cb99-4c41-b126-c122c572684a\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.787017 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4cnx\" (UniqueName: 
\"kubernetes.io/projected/471d6891-4b43-4dd0-86b1-5deb2fa418f7-kube-api-access-b4cnx\") pod \"manila-operator-controller-manager-54f6768c69-spdh6\" (UID: \"471d6891-4b43-4dd0-86b1-5deb2fa418f7\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.787036 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.787066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9697f\" (UniqueName: \"kubernetes.io/projected/48f194fc-64b3-4ef2-9006-8b533ce72000-kube-api-access-9697f\") pod \"horizon-operator-controller-manager-5b9b8895d5-nr7vj\" (UID: \"48f194fc-64b3-4ef2-9006-8b533ce72000\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.787089 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mft6r\" (UniqueName: \"kubernetes.io/projected/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-kube-api-access-mft6r\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.807283 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.810312 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.810755 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.823561 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.831681 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9697f\" (UniqueName: \"kubernetes.io/projected/48f194fc-64b3-4ef2-9006-8b533ce72000-kube-api-access-9697f\") pod \"horizon-operator-controller-manager-5b9b8895d5-nr7vj\" (UID: \"48f194fc-64b3-4ef2-9006-8b533ce72000\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.844423 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.863373 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.864533 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.867741 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jttl8" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.873991 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.874265 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.882030 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.883436 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890037 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhnt5\" (UniqueName: \"kubernetes.io/projected/9693b58d-64b4-4d18-a746-ec0a67606de5-kube-api-access-qhnt5\") pod \"ironic-operator-controller-manager-554564d7fc-wdsjq\" (UID: \"9693b58d-64b4-4d18-a746-ec0a67606de5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890092 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-862ff\" (UniqueName: \"kubernetes.io/projected/cdcfbb9d-667b-4333-bb72-96bbf99ed979-kube-api-access-862ff\") pod \"keystone-operator-controller-manager-b4d948c87-tw7fv\" (UID: \"cdcfbb9d-667b-4333-bb72-96bbf99ed979\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfj4\" (UniqueName: \"kubernetes.io/projected/5716f235-cb99-4c41-b126-c122c572684a-kube-api-access-nnfj4\") pod \"mariadb-operator-controller-manager-6994f66f48-67zr2\" (UID: \"5716f235-cb99-4c41-b126-c122c572684a\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890153 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4cnx\" (UniqueName: \"kubernetes.io/projected/471d6891-4b43-4dd0-86b1-5deb2fa418f7-kube-api-access-b4cnx\") pod \"manila-operator-controller-manager-54f6768c69-spdh6\" (UID: \"471d6891-4b43-4dd0-86b1-5deb2fa418f7\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 
18:00:03.890184 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mft6r\" (UniqueName: \"kubernetes.io/projected/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-kube-api-access-mft6r\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttgt\" (UniqueName: \"kubernetes.io/projected/ab01e10a-ce82-409d-8912-88ec85acac47-kube-api-access-bttgt\") pod \"nova-operator-controller-manager-567668f5cf-7k7pc\" (UID: \"ab01e10a-ce82-409d-8912-88ec85acac47\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890266 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.890287 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn2ff\" (UniqueName: \"kubernetes.io/projected/062a2519-0d5a-4662-8c84-5b8926ba32a2-kube-api-access-cn2ff\") pod \"neutron-operator-controller-manager-64ddbf8bb-m5npp\" (UID: \"062a2519-0d5a-4662-8c84-5b8926ba32a2\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:03 crc kubenswrapper[4892]: E0217 18:00:03.890438 4892 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 
18:00:03 crc kubenswrapper[4892]: E0217 18:00:03.890495 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert podName:0756a5c3-ad9d-4f9c-a3ce-77763bd1182e nodeName:}" failed. No retries permitted until 2026-02-17 18:00:04.390472816 +0000 UTC m=+975.765876081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert") pod "infra-operator-controller-manager-ff5c8777-kccmj" (UID: "0756a5c3-ad9d-4f9c-a3ce-77763bd1182e") : secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.894956 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.905147 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.907571 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-dh8vc" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.931987 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfj4\" (UniqueName: \"kubernetes.io/projected/5716f235-cb99-4c41-b126-c122c572684a-kube-api-access-nnfj4\") pod \"mariadb-operator-controller-manager-6994f66f48-67zr2\" (UID: \"5716f235-cb99-4c41-b126-c122c572684a\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.933293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862ff\" (UniqueName: \"kubernetes.io/projected/cdcfbb9d-667b-4333-bb72-96bbf99ed979-kube-api-access-862ff\") pod 
\"keystone-operator-controller-manager-b4d948c87-tw7fv\" (UID: \"cdcfbb9d-667b-4333-bb72-96bbf99ed979\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.936442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mft6r\" (UniqueName: \"kubernetes.io/projected/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-kube-api-access-mft6r\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.953088 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhnt5\" (UniqueName: \"kubernetes.io/projected/9693b58d-64b4-4d18-a746-ec0a67606de5-kube-api-access-qhnt5\") pod \"ironic-operator-controller-manager-554564d7fc-wdsjq\" (UID: \"9693b58d-64b4-4d18-a746-ec0a67606de5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.959771 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4cnx\" (UniqueName: \"kubernetes.io/projected/471d6891-4b43-4dd0-86b1-5deb2fa418f7-kube-api-access-b4cnx\") pod \"manila-operator-controller-manager-54f6768c69-spdh6\" (UID: \"471d6891-4b43-4dd0-86b1-5deb2fa418f7\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.982254 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.982777 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj"] Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.983798 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.987551 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2bp7k" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.992388 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn2ff\" (UniqueName: \"kubernetes.io/projected/062a2519-0d5a-4662-8c84-5b8926ba32a2-kube-api-access-cn2ff\") pod \"neutron-operator-controller-manager-64ddbf8bb-m5npp\" (UID: \"062a2519-0d5a-4662-8c84-5b8926ba32a2\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.992491 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bttgt\" (UniqueName: \"kubernetes.io/projected/ab01e10a-ce82-409d-8912-88ec85acac47-kube-api-access-bttgt\") pod \"nova-operator-controller-manager-567668f5cf-7k7pc\" (UID: \"ab01e10a-ce82-409d-8912-88ec85acac47\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:03 crc kubenswrapper[4892]: I0217 18:00:03.992538 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79w2g\" (UniqueName: \"kubernetes.io/projected/07a93e09-771e-4a85-89d0-e6fb19dcdb2d-kube-api-access-79w2g\") pod \"octavia-operator-controller-manager-69f8888797-hjp6k\" (UID: 
\"07a93e09-771e-4a85-89d0-e6fb19dcdb2d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.004163 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.006012 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.008474 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.008681 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gcxzf" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.018540 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn2ff\" (UniqueName: \"kubernetes.io/projected/062a2519-0d5a-4662-8c84-5b8926ba32a2-kube-api-access-cn2ff\") pod \"neutron-operator-controller-manager-64ddbf8bb-m5npp\" (UID: \"062a2519-0d5a-4662-8c84-5b8926ba32a2\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.028503 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttgt\" (UniqueName: \"kubernetes.io/projected/ab01e10a-ce82-409d-8912-88ec85acac47-kube-api-access-bttgt\") pod \"nova-operator-controller-manager-567668f5cf-7k7pc\" (UID: \"ab01e10a-ce82-409d-8912-88ec85acac47\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.036929 4892 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.041602 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.052660 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xggx8" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.060352 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.065623 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.094036 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79w2g\" (UniqueName: \"kubernetes.io/projected/07a93e09-771e-4a85-89d0-e6fb19dcdb2d-kube-api-access-79w2g\") pod \"octavia-operator-controller-manager-69f8888797-hjp6k\" (UID: \"07a93e09-771e-4a85-89d0-e6fb19dcdb2d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.094087 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.094142 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9l59\" (UniqueName: \"kubernetes.io/projected/6622263b-b231-458c-b9fc-19061a5d73a7-kube-api-access-k9l59\") pod \"placement-operator-controller-manager-8497b45c89-ktptw\" (UID: \"6622263b-b231-458c-b9fc-19061a5d73a7\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.094177 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmtww\" (UniqueName: \"kubernetes.io/projected/67b2947f-d96d-4697-9527-da5cbabc0552-kube-api-access-jmtww\") pod \"ovn-operator-controller-manager-d44cf6b75-lf9pj\" (UID: \"67b2947f-d96d-4697-9527-da5cbabc0552\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.094197 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4fs\" (UniqueName: \"kubernetes.io/projected/86e53c34-ab5b-49ab-a14e-13d76792b6ef-kube-api-access-fh4fs\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.109930 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.115827 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79w2g\" (UniqueName: \"kubernetes.io/projected/07a93e09-771e-4a85-89d0-e6fb19dcdb2d-kube-api-access-79w2g\") pod \"octavia-operator-controller-manager-69f8888797-hjp6k\" (UID: \"07a93e09-771e-4a85-89d0-e6fb19dcdb2d\") " 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.117988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.125490 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.141941 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.152118 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.153462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.155403 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9t4x9" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.165133 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.182190 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.191089 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.194968 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.195019 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qnxr\" (UniqueName: \"kubernetes.io/projected/689be620-172c-415d-927a-0a4f9ea9f5cb-kube-api-access-9qnxr\") pod \"swift-operator-controller-manager-68f46476f-9ws6d\" (UID: \"689be620-172c-415d-927a-0a4f9ea9f5cb\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.195055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9l59\" (UniqueName: \"kubernetes.io/projected/6622263b-b231-458c-b9fc-19061a5d73a7-kube-api-access-k9l59\") pod \"placement-operator-controller-manager-8497b45c89-ktptw\" (UID: \"6622263b-b231-458c-b9fc-19061a5d73a7\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.195092 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmtww\" (UniqueName: \"kubernetes.io/projected/67b2947f-d96d-4697-9527-da5cbabc0552-kube-api-access-jmtww\") pod \"ovn-operator-controller-manager-d44cf6b75-lf9pj\" (UID: \"67b2947f-d96d-4697-9527-da5cbabc0552\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.195111 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4fs\" (UniqueName: \"kubernetes.io/projected/86e53c34-ab5b-49ab-a14e-13d76792b6ef-kube-api-access-fh4fs\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.195510 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.195583 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert podName:86e53c34-ab5b-49ab-a14e-13d76792b6ef nodeName:}" failed. No retries permitted until 2026-02-17 18:00:04.695550163 +0000 UTC m=+976.070953428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" (UID: "86e53c34-ab5b-49ab-a14e-13d76792b6ef") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.224647 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.226873 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9l59\" (UniqueName: \"kubernetes.io/projected/6622263b-b231-458c-b9fc-19061a5d73a7-kube-api-access-k9l59\") pod \"placement-operator-controller-manager-8497b45c89-ktptw\" (UID: \"6622263b-b231-458c-b9fc-19061a5d73a7\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.230511 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh4fs\" (UniqueName: \"kubernetes.io/projected/86e53c34-ab5b-49ab-a14e-13d76792b6ef-kube-api-access-fh4fs\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.230709 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.296788 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnxr\" (UniqueName: \"kubernetes.io/projected/689be620-172c-415d-927a-0a4f9ea9f5cb-kube-api-access-9qnxr\") pod \"swift-operator-controller-manager-68f46476f-9ws6d\" (UID: \"689be620-172c-415d-927a-0a4f9ea9f5cb\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.315488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmtww\" (UniqueName: \"kubernetes.io/projected/67b2947f-d96d-4697-9527-da5cbabc0552-kube-api-access-jmtww\") pod \"ovn-operator-controller-manager-d44cf6b75-lf9pj\" (UID: \"67b2947f-d96d-4697-9527-da5cbabc0552\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.352479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qnxr\" (UniqueName: \"kubernetes.io/projected/689be620-172c-415d-927a-0a4f9ea9f5cb-kube-api-access-9qnxr\") pod \"swift-operator-controller-manager-68f46476f-9ws6d\" (UID: \"689be620-172c-415d-927a-0a4f9ea9f5cb\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.407325 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.407564 4892 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.407614 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert podName:0756a5c3-ad9d-4f9c-a3ce-77763bd1182e nodeName:}" failed. No retries permitted until 2026-02-17 18:00:05.407599078 +0000 UTC m=+976.783002343 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert") pod "infra-operator-controller-manager-ff5c8777-kccmj" (UID: "0756a5c3-ad9d-4f9c-a3ce-77763bd1182e") : secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.420686 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.421884 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.425273 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.426055 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9r9fg" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.432621 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.455240 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-cpl2x"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.456355 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.459073 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8wk2z" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.461376 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-cpl2x"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.465497 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.503019 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.507630 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.518718 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2ww2f" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.537845 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg"] Feb 17 18:00:04 crc kubenswrapper[4892]: W0217 18:00:04.538935 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7add95_6153_4284_b9be_a76a4142a35e.slice/crio-d481585e85e2a145e05871894954ed934734557e6b8d93459bf446ff5143c167 WatchSource:0}: Error finding container d481585e85e2a145e05871894954ed934734557e6b8d93459bf446ff5143c167: Status 404 returned error can't find the container with id d481585e85e2a145e05871894954ed934734557e6b8d93459bf446ff5143c167 Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.540451 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.554135 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9sls\" (UniqueName: \"kubernetes.io/projected/2f32911d-efa2-45de-8a11-497adcb7d2bd-kube-api-access-h9sls\") pod \"telemetry-operator-controller-manager-7f45b4ff68-tzf7x\" (UID: \"2f32911d-efa2-45de-8a11-497adcb7d2bd\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.641210 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.642193 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.646431 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.647064 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.647664 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7jjf9" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.656495 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b9l7\" (UniqueName: \"kubernetes.io/projected/f533d37d-5213-42af-82db-fc5e0208cb8d-kube-api-access-5b9l7\") pod \"test-operator-controller-manager-7866795846-cpl2x\" (UID: \"f533d37d-5213-42af-82db-fc5e0208cb8d\") " 
pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.656540 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.656646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9sls\" (UniqueName: \"kubernetes.io/projected/2f32911d-efa2-45de-8a11-497adcb7d2bd-kube-api-access-h9sls\") pod \"telemetry-operator-controller-manager-7f45b4ff68-tzf7x\" (UID: \"2f32911d-efa2-45de-8a11-497adcb7d2bd\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.656678 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.656776 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsj24\" (UniqueName: \"kubernetes.io/projected/1b49b91f-8f48-4831-ae32-4b1a9287123f-kube-api-access-vsj24\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 
18:00:04.656862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmvx\" (UniqueName: \"kubernetes.io/projected/a1264594-6ddc-417a-bea1-9bb9eedbb719-kube-api-access-2jmvx\") pod \"watcher-operator-controller-manager-5db88f68c-xftdg\" (UID: \"a1264594-6ddc-417a-bea1-9bb9eedbb719\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.671372 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.687305 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.688771 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.694276 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8fbw4" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.697533 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.705626 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.705914 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9sls\" (UniqueName: \"kubernetes.io/projected/2f32911d-efa2-45de-8a11-497adcb7d2bd-kube-api-access-h9sls\") pod \"telemetry-operator-controller-manager-7f45b4ff68-tzf7x\" (UID: 
\"2f32911d-efa2-45de-8a11-497adcb7d2bd\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.738082 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" event={"ID":"ed7add95-6153-4284-b9be-a76a4142a35e","Type":"ContainerStarted","Data":"d481585e85e2a145e05871894954ed934734557e6b8d93459bf446ff5143c167"} Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.757865 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b9l7\" (UniqueName: \"kubernetes.io/projected/f533d37d-5213-42af-82db-fc5e0208cb8d-kube-api-access-5b9l7\") pod \"test-operator-controller-manager-7866795846-cpl2x\" (UID: \"f533d37d-5213-42af-82db-fc5e0208cb8d\") " pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.757962 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.758012 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.758076 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsj24\" 
(UniqueName: \"kubernetes.io/projected/1b49b91f-8f48-4831-ae32-4b1a9287123f-kube-api-access-vsj24\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.758100 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.758168 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jmvx\" (UniqueName: \"kubernetes.io/projected/a1264594-6ddc-417a-bea1-9bb9eedbb719-kube-api-access-2jmvx\") pod \"watcher-operator-controller-manager-5db88f68c-xftdg\" (UID: \"a1264594-6ddc-417a-bea1-9bb9eedbb719\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.758251 4892 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.758316 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:05.258294176 +0000 UTC m=+976.633697441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "metrics-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.758371 4892 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.758396 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:05.258388359 +0000 UTC m=+976.633791624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.758573 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: E0217 18:00:04.758641 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert podName:86e53c34-ab5b-49ab-a14e-13d76792b6ef nodeName:}" failed. No retries permitted until 2026-02-17 18:00:05.758615005 +0000 UTC m=+977.134018270 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" (UID: "86e53c34-ab5b-49ab-a14e-13d76792b6ef") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.775564 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jmvx\" (UniqueName: \"kubernetes.io/projected/a1264594-6ddc-417a-bea1-9bb9eedbb719-kube-api-access-2jmvx\") pod \"watcher-operator-controller-manager-5db88f68c-xftdg\" (UID: \"a1264594-6ddc-417a-bea1-9bb9eedbb719\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.777849 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b9l7\" (UniqueName: \"kubernetes.io/projected/f533d37d-5213-42af-82db-fc5e0208cb8d-kube-api-access-5b9l7\") pod \"test-operator-controller-manager-7866795846-cpl2x\" (UID: \"f533d37d-5213-42af-82db-fc5e0208cb8d\") " pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.794029 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsj24\" (UniqueName: \"kubernetes.io/projected/1b49b91f-8f48-4831-ae32-4b1a9287123f-kube-api-access-vsj24\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.805018 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.809556 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.861153 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvrh\" (UniqueName: \"kubernetes.io/projected/5bc11699-06b5-4fd8-b8ff-6f3ecbda532f-kube-api-access-zhvrh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jd4b7\" (UID: \"5bc11699-06b5-4fd8-b8ff-6f3ecbda532f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.866521 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.888349 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9"] Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.962264 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvrh\" (UniqueName: \"kubernetes.io/projected/5bc11699-06b5-4fd8-b8ff-6f3ecbda532f-kube-api-access-zhvrh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jd4b7\" (UID: \"5bc11699-06b5-4fd8-b8ff-6f3ecbda532f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" Feb 17 18:00:04 crc kubenswrapper[4892]: I0217 18:00:04.992242 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvrh\" (UniqueName: \"kubernetes.io/projected/5bc11699-06b5-4fd8-b8ff-6f3ecbda532f-kube-api-access-zhvrh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jd4b7\" (UID: \"5bc11699-06b5-4fd8-b8ff-6f3ecbda532f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.089732 4892 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.145103 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.270645 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.270702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.270803 4892 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.270891 4892 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.270901 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:06.270877966 +0000 UTC m=+977.646281231 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "webhook-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.270938 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:06.270921297 +0000 UTC m=+977.646324562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "metrics-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.329888 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.349182 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.386012 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd"] Feb 17 18:00:05 crc kubenswrapper[4892]: W0217 18:00:05.394909 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dcfa260_ea93_42f8_a345_aec700f9e938.slice/crio-02fba7e0a42a11fb2b0c4c82c8205015374f767975a1621ff2c7a3cb33bbb1e7 WatchSource:0}: Error finding container 02fba7e0a42a11fb2b0c4c82c8205015374f767975a1621ff2c7a3cb33bbb1e7: Status 404 returned error can't 
find the container with id 02fba7e0a42a11fb2b0c4c82c8205015374f767975a1621ff2c7a3cb33bbb1e7 Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.452659 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.463370 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.473031 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.473965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.474135 4892 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.474192 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert podName:0756a5c3-ad9d-4f9c-a3ce-77763bd1182e nodeName:}" failed. No retries permitted until 2026-02-17 18:00:07.474175335 +0000 UTC m=+978.849578600 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert") pod "infra-operator-controller-manager-ff5c8777-kccmj" (UID: "0756a5c3-ad9d-4f9c-a3ce-77763bd1182e") : secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.484964 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.501794 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.782253 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.782473 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: E0217 18:00:05.782525 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert podName:86e53c34-ab5b-49ab-a14e-13d76792b6ef nodeName:}" failed. No retries permitted until 2026-02-17 18:00:07.782505519 +0000 UTC m=+979.157908784 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" (UID: "86e53c34-ab5b-49ab-a14e-13d76792b6ef") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.783037 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" event={"ID":"689be620-172c-415d-927a-0a4f9ea9f5cb","Type":"ContainerStarted","Data":"2b2b80f81f61dce03a50f7450d47a9cf1655194194194f6ee5b7adcb08fefc31"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.792880 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" event={"ID":"67ccc01c-23ce-407b-91dd-9554c49acbd5","Type":"ContainerStarted","Data":"4b9501b25a05046adf21d608cfb1a0a3073776141b9539efb901518928e08e92"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.795880 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" event={"ID":"cdcfbb9d-667b-4333-bb72-96bbf99ed979","Type":"ContainerStarted","Data":"57b3e922fce4af60186acf1022c5394a325dd79057740d4b1de3b642c3cd5077"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.799927 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" event={"ID":"67b2947f-d96d-4697-9527-da5cbabc0552","Type":"ContainerStarted","Data":"b1da92427b39e4f17e1e5d701bcfaa58014b9c478523176811518a5e44454990"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.801649 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" 
event={"ID":"cc4b7060-d89f-47c5-b2e4-2a793606350c","Type":"ContainerStarted","Data":"c50945573d7b8046678886effe122dafd4c0ac00fa65c5ce78c1826424676a8e"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.802944 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" event={"ID":"727885b3-46ff-43a8-9991-28567f43d07e","Type":"ContainerStarted","Data":"b43f4f6bea9679a98ec9abff3dad1d58013e5102ab65522c64c5ce74f949c71f"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.805731 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" event={"ID":"48f194fc-64b3-4ef2-9006-8b533ce72000","Type":"ContainerStarted","Data":"a3f3f8ec88d8a2265132a49003e170eb49511580497e3bdf3eae6af51753ea06"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.812667 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" event={"ID":"8dcfa260-ea93-42f8-a345-aec700f9e938","Type":"ContainerStarted","Data":"02fba7e0a42a11fb2b0c4c82c8205015374f767975a1621ff2c7a3cb33bbb1e7"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.814739 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" event={"ID":"471d6891-4b43-4dd0-86b1-5deb2fa418f7","Type":"ContainerStarted","Data":"f22e1e288a73aae8a99417039d32ea06626040f1ec706716cd548e51bc1cc9b3"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.820969 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" event={"ID":"ab01e10a-ce82-409d-8912-88ec85acac47","Type":"ContainerStarted","Data":"b9a1763d520195a81102c5831980f94cef32710b4f52e37f4dae180afad3d923"} Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.861684 4892 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.868437 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.879554 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.921906 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.929885 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.933853 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw"] Feb 17 18:00:05 crc kubenswrapper[4892]: I0217 18:00:05.992031 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7"] Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.001245 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-cpl2x"] Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.013206 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k9l59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-ktptw_openstack-operators(6622263b-b231-458c-b9fc-19061a5d73a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.014530 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" podUID="6622263b-b231-458c-b9fc-19061a5d73a7" Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.037473 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2"] Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.041123 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zhvrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-jd4b7_openstack-operators(5bc11699-06b5-4fd8-b8ff-6f3ecbda532f): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.042395 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" podUID="5bc11699-06b5-4fd8-b8ff-6f3ecbda532f" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.061132 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5b9l7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-cpl2x_openstack-operators(f533d37d-5213-42af-82db-fc5e0208cb8d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.065270 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" podUID="f533d37d-5213-42af-82db-fc5e0208cb8d" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.093150 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nnfj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-67zr2_openstack-operators(5716f235-cb99-4c41-b126-c122c572684a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.094671 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" podUID="5716f235-cb99-4c41-b126-c122c572684a" Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.296708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.296780 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.297008 4892 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.297020 4892 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.297062 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:08.29704621 +0000 UTC m=+979.672449475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "metrics-server-cert" not found Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.297114 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:08.297094612 +0000 UTC m=+979.672497977 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "webhook-server-cert" not found Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.834794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" event={"ID":"2f32911d-efa2-45de-8a11-497adcb7d2bd","Type":"ContainerStarted","Data":"62eed875805661d131a4a70692b9e142f23ecabe697631715d45e65d84786670"} Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.839293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" event={"ID":"5716f235-cb99-4c41-b126-c122c572684a","Type":"ContainerStarted","Data":"6d5a168902040925e638d3a808202fc35f50b80bbf0e52a955cfa2d58e5293c5"} Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.843023 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" podUID="5716f235-cb99-4c41-b126-c122c572684a" Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.847568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" event={"ID":"9693b58d-64b4-4d18-a746-ec0a67606de5","Type":"ContainerStarted","Data":"3f81a030815c3d1d57bc93d729649864fbc9dc6eb1f8874b0aa9e539f8979aed"} Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.848686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" event={"ID":"f533d37d-5213-42af-82db-fc5e0208cb8d","Type":"ContainerStarted","Data":"9e93d00236149be8c62a20ef03156edcece542003ebee8125e8d81e89597093c"} Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.851775 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" event={"ID":"062a2519-0d5a-4662-8c84-5b8926ba32a2","Type":"ContainerStarted","Data":"d32151c20cd83a2987ec7916c5050728bff56712d588c107b1a9d83f58b46f3c"} Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.853574 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" podUID="f533d37d-5213-42af-82db-fc5e0208cb8d" Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.861182 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" event={"ID":"5bc11699-06b5-4fd8-b8ff-6f3ecbda532f","Type":"ContainerStarted","Data":"802cd09a7f7c6980b6ca3bde74779f4cff47232edf03c74a6f92784756e8950e"} Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.863926 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" event={"ID":"6622263b-b231-458c-b9fc-19061a5d73a7","Type":"ContainerStarted","Data":"95f1eb06f8e150dcf62c0b930f8c514786e8abbaecbcea826407e5241bf530e7"} Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.866033 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" podUID="5bc11699-06b5-4fd8-b8ff-6f3ecbda532f" Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.866532 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" event={"ID":"07a93e09-771e-4a85-89d0-e6fb19dcdb2d","Type":"ContainerStarted","Data":"63422be1c86273c5267dcd86f7a01e3d3f883b77c65c6cc25b62b265dee00119"} Feb 17 18:00:06 crc kubenswrapper[4892]: I0217 18:00:06.873710 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" event={"ID":"a1264594-6ddc-417a-bea1-9bb9eedbb719","Type":"ContainerStarted","Data":"2f53426399e1b585605e465a2c0960036b12b7c18dd73965b86df6f1b14e2535"} Feb 17 18:00:06 crc kubenswrapper[4892]: E0217 18:00:06.893964 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" podUID="6622263b-b231-458c-b9fc-19061a5d73a7" Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.424781 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.424849 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.425018 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.426027 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"277a892ddcda11348b051b3a2c03162bd0db1300ec13dbc17277b62b780132f1"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.426543 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://277a892ddcda11348b051b3a2c03162bd0db1300ec13dbc17277b62b780132f1" gracePeriod=600 Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.516669 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.516892 4892 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.517273 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert podName:0756a5c3-ad9d-4f9c-a3ce-77763bd1182e nodeName:}" failed. No retries permitted until 2026-02-17 18:00:11.517246345 +0000 UTC m=+982.892649700 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert") pod "infra-operator-controller-manager-ff5c8777-kccmj" (UID: "0756a5c3-ad9d-4f9c-a3ce-77763bd1182e") : secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.820617 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.820805 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.820902 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert podName:86e53c34-ab5b-49ab-a14e-13d76792b6ef nodeName:}" failed. No retries permitted until 2026-02-17 18:00:11.820880862 +0000 UTC m=+983.196284127 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" (UID: "86e53c34-ab5b-49ab-a14e-13d76792b6ef") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.895923 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="277a892ddcda11348b051b3a2c03162bd0db1300ec13dbc17277b62b780132f1" exitCode=0 Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.896282 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"277a892ddcda11348b051b3a2c03162bd0db1300ec13dbc17277b62b780132f1"} Feb 17 18:00:07 crc kubenswrapper[4892]: I0217 18:00:07.896348 4892 scope.go:117] "RemoveContainer" containerID="921d973508b9907ab2d7e5529ed27c9f162bd1cd21401233f975dc91366a6d72" Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.902508 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" podUID="5bc11699-06b5-4fd8-b8ff-6f3ecbda532f" Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.902939 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" 
podUID="5716f235-cb99-4c41-b126-c122c572684a" Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.903108 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" podUID="6622263b-b231-458c-b9fc-19061a5d73a7" Feb 17 18:00:07 crc kubenswrapper[4892]: E0217 18:00:07.903182 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" podUID="f533d37d-5213-42af-82db-fc5e0208cb8d" Feb 17 18:00:08 crc kubenswrapper[4892]: I0217 18:00:08.328406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:08 crc kubenswrapper[4892]: E0217 18:00:08.328557 4892 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 18:00:08 crc kubenswrapper[4892]: E0217 18:00:08.328605 4892 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 18:00:08 crc kubenswrapper[4892]: E0217 18:00:08.328611 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:12.32859545 +0000 UTC m=+983.703998715 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "metrics-server-cert" not found Feb 17 18:00:08 crc kubenswrapper[4892]: E0217 18:00:08.328634 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:12.328625451 +0000 UTC m=+983.704028716 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "webhook-server-cert" not found Feb 17 18:00:08 crc kubenswrapper[4892]: I0217 18:00:08.328561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:11 crc kubenswrapper[4892]: I0217 18:00:11.604294 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: 
\"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:11 crc kubenswrapper[4892]: E0217 18:00:11.604481 4892 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:11 crc kubenswrapper[4892]: E0217 18:00:11.604749 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert podName:0756a5c3-ad9d-4f9c-a3ce-77763bd1182e nodeName:}" failed. No retries permitted until 2026-02-17 18:00:19.604728901 +0000 UTC m=+990.980132256 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert") pod "infra-operator-controller-manager-ff5c8777-kccmj" (UID: "0756a5c3-ad9d-4f9c-a3ce-77763bd1182e") : secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:11 crc kubenswrapper[4892]: I0217 18:00:11.909225 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:11 crc kubenswrapper[4892]: E0217 18:00:11.909413 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:11 crc kubenswrapper[4892]: E0217 18:00:11.909509 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert podName:86e53c34-ab5b-49ab-a14e-13d76792b6ef nodeName:}" failed. 
No retries permitted until 2026-02-17 18:00:19.90948652 +0000 UTC m=+991.284889785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" (UID: "86e53c34-ab5b-49ab-a14e-13d76792b6ef") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:12 crc kubenswrapper[4892]: I0217 18:00:12.418292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:12 crc kubenswrapper[4892]: I0217 18:00:12.418381 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:12 crc kubenswrapper[4892]: E0217 18:00:12.418439 4892 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 18:00:12 crc kubenswrapper[4892]: E0217 18:00:12.418495 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:20.418478122 +0000 UTC m=+991.793881387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "webhook-server-cert" not found Feb 17 18:00:12 crc kubenswrapper[4892]: E0217 18:00:12.418617 4892 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 18:00:12 crc kubenswrapper[4892]: E0217 18:00:12.418698 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:20.418677107 +0000 UTC m=+991.794080382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "metrics-server-cert" not found Feb 17 18:00:17 crc kubenswrapper[4892]: E0217 18:00:17.938488 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 17 18:00:17 crc kubenswrapper[4892]: E0217 18:00:17.938966 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmtww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-lf9pj_openstack-operators(67b2947f-d96d-4697-9527-da5cbabc0552): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:17 crc kubenswrapper[4892]: E0217 18:00:17.940331 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" podUID="67b2947f-d96d-4697-9527-da5cbabc0552" Feb 17 18:00:17 crc kubenswrapper[4892]: E0217 18:00:17.984147 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" podUID="67b2947f-d96d-4697-9527-da5cbabc0552" Feb 17 18:00:19 crc kubenswrapper[4892]: I0217 18:00:19.644873 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:19 crc kubenswrapper[4892]: E0217 18:00:19.645139 4892 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:19 crc kubenswrapper[4892]: E0217 18:00:19.645315 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert podName:0756a5c3-ad9d-4f9c-a3ce-77763bd1182e nodeName:}" failed. No retries permitted until 2026-02-17 18:00:35.645294126 +0000 UTC m=+1007.020697401 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert") pod "infra-operator-controller-manager-ff5c8777-kccmj" (UID: "0756a5c3-ad9d-4f9c-a3ce-77763bd1182e") : secret "infra-operator-webhook-server-cert" not found Feb 17 18:00:19 crc kubenswrapper[4892]: I0217 18:00:19.949002 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:19 crc kubenswrapper[4892]: E0217 18:00:19.949251 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:19 crc kubenswrapper[4892]: E0217 18:00:19.949311 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert podName:86e53c34-ab5b-49ab-a14e-13d76792b6ef nodeName:}" failed. No retries permitted until 2026-02-17 18:00:35.949294844 +0000 UTC m=+1007.324698109 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" (UID: "86e53c34-ab5b-49ab-a14e-13d76792b6ef") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.272585 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.272756 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cn2ff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-m5npp_openstack-operators(062a2519-0d5a-4662-8c84-5b8926ba32a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.274056 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" podUID="062a2519-0d5a-4662-8c84-5b8926ba32a2" Feb 17 18:00:20 crc kubenswrapper[4892]: I0217 18:00:20.457032 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:20 crc kubenswrapper[4892]: I0217 18:00:20.457094 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.457204 4892 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.457257 4892 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.457295 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:36.457276669 +0000 UTC m=+1007.832679934 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "webhook-server-cert" not found Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.457316 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs podName:1b49b91f-8f48-4831-ae32-4b1a9287123f nodeName:}" failed. No retries permitted until 2026-02-17 18:00:36.45730609 +0000 UTC m=+1007.832709365 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs") pod "openstack-operator-controller-manager-b96b9dfc9-cb6tc" (UID: "1b49b91f-8f48-4831-ae32-4b1a9287123f") : secret "metrics-server-cert" not found Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.806059 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0" Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.806234 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jmvx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-xftdg_openstack-operators(a1264594-6ddc-417a-bea1-9bb9eedbb719): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:20 crc kubenswrapper[4892]: E0217 18:00:20.807381 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" podUID="a1264594-6ddc-417a-bea1-9bb9eedbb719" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.002488 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" podUID="062a2519-0d5a-4662-8c84-5b8926ba32a2" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.002561 4892 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" podUID="a1264594-6ddc-417a-bea1-9bb9eedbb719" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.371863 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.372226 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhnt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-wdsjq_openstack-operators(9693b58d-64b4-4d18-a746-ec0a67606de5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.374067 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" podUID="9693b58d-64b4-4d18-a746-ec0a67606de5" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.936777 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.936982 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6dpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-254f9_openstack-operators(727885b3-46ff-43a8-9991-28567f43d07e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:21 crc kubenswrapper[4892]: E0217 18:00:21.938806 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" podUID="727885b3-46ff-43a8-9991-28567f43d07e" Feb 17 18:00:22 crc kubenswrapper[4892]: E0217 18:00:22.009142 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" podUID="9693b58d-64b4-4d18-a746-ec0a67606de5" Feb 17 18:00:22 crc kubenswrapper[4892]: E0217 18:00:22.011587 4892 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" podUID="727885b3-46ff-43a8-9991-28567f43d07e" Feb 17 18:00:22 crc kubenswrapper[4892]: E0217 18:00:22.635231 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df" Feb 17 18:00:22 crc kubenswrapper[4892]: E0217 18:00:22.635452 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rv8nx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-ml9zk_openstack-operators(8dcfa260-ea93-42f8-a345-aec700f9e938): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:22 crc kubenswrapper[4892]: E0217 18:00:22.637224 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" podUID="8dcfa260-ea93-42f8-a345-aec700f9e938" Feb 17 18:00:23 crc kubenswrapper[4892]: E0217 18:00:23.014902 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" podUID="8dcfa260-ea93-42f8-a345-aec700f9e938" Feb 17 18:00:23 crc kubenswrapper[4892]: E0217 18:00:23.241752 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2" Feb 17 18:00:23 crc kubenswrapper[4892]: E0217 18:00:23.241930 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28mml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69f49c598c-ngzhd_openstack-operators(67ccc01c-23ce-407b-91dd-9554c49acbd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:23 crc kubenswrapper[4892]: E0217 18:00:23.243267 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" podUID="67ccc01c-23ce-407b-91dd-9554c49acbd5" Feb 17 18:00:24 crc kubenswrapper[4892]: E0217 18:00:24.022247 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" podUID="67ccc01c-23ce-407b-91dd-9554c49acbd5" Feb 17 18:00:24 crc kubenswrapper[4892]: E0217 18:00:24.790332 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 17 18:00:24 crc kubenswrapper[4892]: E0217 18:00:24.790606 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-862ff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-tw7fv_openstack-operators(cdcfbb9d-667b-4333-bb72-96bbf99ed979): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:24 crc kubenswrapper[4892]: E0217 18:00:24.791897 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" podUID="cdcfbb9d-667b-4333-bb72-96bbf99ed979" Feb 17 18:00:25 crc kubenswrapper[4892]: E0217 18:00:25.028867 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" podUID="cdcfbb9d-667b-4333-bb72-96bbf99ed979" Feb 17 18:00:25 crc kubenswrapper[4892]: E0217 18:00:25.774301 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99" Feb 17 18:00:25 crc kubenswrapper[4892]: E0217 18:00:25.774469 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h9sls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-tzf7x_openstack-operators(2f32911d-efa2-45de-8a11-497adcb7d2bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:00:25 crc kubenswrapper[4892]: E0217 18:00:25.776608 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" podUID="2f32911d-efa2-45de-8a11-497adcb7d2bd" Feb 17 18:00:26 crc kubenswrapper[4892]: I0217 18:00:26.040360 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"39bb78d4e8e45cbad6e675e4abf8cc16247b09380bca60a00134831853f3fc17"} Feb 17 18:00:26 crc kubenswrapper[4892]: E0217 18:00:26.041450 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" podUID="2f32911d-efa2-45de-8a11-497adcb7d2bd" Feb 17 18:00:27 crc kubenswrapper[4892]: I0217 18:00:27.048044 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" event={"ID":"cc4b7060-d89f-47c5-b2e4-2a793606350c","Type":"ContainerStarted","Data":"3037fd3993ca07af2fa55247fb41c9f1c223a48ac31f356d6a577fa6d5318a09"} Feb 17 18:00:27 crc kubenswrapper[4892]: I0217 18:00:27.049493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" event={"ID":"48f194fc-64b3-4ef2-9006-8b533ce72000","Type":"ContainerStarted","Data":"76383e265ea0359d55ed3e51165ffd91efcbc54205644b3d02cc5331ee8fd680"} Feb 17 18:00:27 crc kubenswrapper[4892]: I0217 18:00:27.063189 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" podStartSLOduration=3.633610398 podStartE2EDuration="24.06317123s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.381596605 +0000 UTC m=+976.756999870" lastFinishedPulling="2026-02-17 18:00:25.811157397 +0000 UTC m=+997.186560702" observedRunningTime="2026-02-17 18:00:27.061124854 +0000 UTC m=+998.436528129" watchObservedRunningTime="2026-02-17 18:00:27.06317123 +0000 UTC m=+998.438574495" Feb 17 
18:00:28 crc kubenswrapper[4892]: I0217 18:00:28.058676 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" event={"ID":"07a93e09-771e-4a85-89d0-e6fb19dcdb2d","Type":"ContainerStarted","Data":"5cc4b2eb581408e854797514374ffc20a6c5a51e6620b20100fd49dd97afc90b"} Feb 17 18:00:28 crc kubenswrapper[4892]: I0217 18:00:28.063901 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" event={"ID":"689be620-172c-415d-927a-0a4f9ea9f5cb","Type":"ContainerStarted","Data":"b45b87ea61fe698a198ee7225979f40aab10d5d3ec7d8e171118a71e5400eaa8"} Feb 17 18:00:28 crc kubenswrapper[4892]: I0217 18:00:28.084015 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" event={"ID":"ab01e10a-ce82-409d-8912-88ec85acac47","Type":"ContainerStarted","Data":"9b08a03690173b48ab807c1e0f4a640c66dc92b322ebc8f593f4bd0ca9121b38"} Feb 17 18:00:28 crc kubenswrapper[4892]: I0217 18:00:28.084092 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:28 crc kubenswrapper[4892]: I0217 18:00:28.085220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:28 crc kubenswrapper[4892]: I0217 18:00:28.112074 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" podStartSLOduration=4.483556976 podStartE2EDuration="25.112052799s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.176455666 +0000 UTC m=+976.551858931" lastFinishedPulling="2026-02-17 18:00:25.804951459 +0000 UTC m=+997.180354754" observedRunningTime="2026-02-17 
18:00:28.10249828 +0000 UTC m=+999.477901545" watchObservedRunningTime="2026-02-17 18:00:28.112052799 +0000 UTC m=+999.487456074" Feb 17 18:00:29 crc kubenswrapper[4892]: I0217 18:00:29.074963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" event={"ID":"471d6891-4b43-4dd0-86b1-5deb2fa418f7","Type":"ContainerStarted","Data":"57274c62afdecf7d95980c2aca7aa292a17c3143673e270e153eae8a9ac8b8b5"} Feb 17 18:00:29 crc kubenswrapper[4892]: I0217 18:00:29.076190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" event={"ID":"ed7add95-6153-4284-b9be-a76a4142a35e","Type":"ContainerStarted","Data":"f2bdd5bd0631c30c9d17f36363589df01d56fb659db991a165628e2f9f06b520"} Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.113116 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.113550 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.137030 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" podStartSLOduration=8.316844829 podStartE2EDuration="28.137006828s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.990364001 +0000 UTC m=+977.365767256" lastFinishedPulling="2026-02-17 18:00:25.810526 +0000 UTC m=+997.185929255" observedRunningTime="2026-02-17 18:00:31.125728444 +0000 UTC m=+1002.501131719" watchObservedRunningTime="2026-02-17 18:00:31.137006828 +0000 UTC m=+1002.512410103" Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.142904 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" podStartSLOduration=7.814865547 podStartE2EDuration="28.142891187s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.482616903 +0000 UTC m=+976.858020168" lastFinishedPulling="2026-02-17 18:00:25.810642513 +0000 UTC m=+997.186045808" observedRunningTime="2026-02-17 18:00:31.141060018 +0000 UTC m=+1002.516463353" watchObservedRunningTime="2026-02-17 18:00:31.142891187 +0000 UTC m=+1002.518294462" Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.158222 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" podStartSLOduration=7.813777637 podStartE2EDuration="28.15819274s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.489283792 +0000 UTC m=+976.864687057" lastFinishedPulling="2026-02-17 18:00:25.833698895 +0000 UTC m=+997.209102160" observedRunningTime="2026-02-17 18:00:31.156865564 +0000 UTC m=+1002.532268909" watchObservedRunningTime="2026-02-17 18:00:31.15819274 +0000 UTC m=+1002.533596055" Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.193835 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" podStartSLOduration=7.891523487 podStartE2EDuration="28.193791962s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.50808928 +0000 UTC m=+976.883492545" lastFinishedPulling="2026-02-17 18:00:25.810357755 +0000 UTC m=+997.185761020" observedRunningTime="2026-02-17 18:00:31.177149623 +0000 UTC m=+1002.552552908" watchObservedRunningTime="2026-02-17 18:00:31.193791962 +0000 UTC m=+1002.569195227" Feb 17 18:00:31 crc kubenswrapper[4892]: I0217 18:00:31.201286 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" podStartSLOduration=6.966370637 podStartE2EDuration="28.201272053s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:04.569982572 +0000 UTC m=+975.945385827" lastFinishedPulling="2026-02-17 18:00:25.804883948 +0000 UTC m=+997.180287243" observedRunningTime="2026-02-17 18:00:31.189235159 +0000 UTC m=+1002.564638464" watchObservedRunningTime="2026-02-17 18:00:31.201272053 +0000 UTC m=+1002.576675318" Feb 17 18:00:32 crc kubenswrapper[4892]: I0217 18:00:32.130079 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9ws6d" Feb 17 18:00:32 crc kubenswrapper[4892]: I0217 18:00:32.363748 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:00:33 crc kubenswrapper[4892]: I0217 18:00:33.781629 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:33 crc kubenswrapper[4892]: I0217 18:00:33.784889 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-p6wgw" Feb 17 18:00:33 crc kubenswrapper[4892]: I0217 18:00:33.817593 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5pdhr" Feb 17 18:00:33 crc kubenswrapper[4892]: I0217 18:00:33.986284 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 18:00:34 crc kubenswrapper[4892]: I0217 18:00:34.120711 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-spdh6" Feb 17 18:00:34 crc kubenswrapper[4892]: I0217 18:00:34.191890 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:34 crc kubenswrapper[4892]: I0217 18:00:34.194141 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7k7pc" Feb 17 18:00:34 crc kubenswrapper[4892]: I0217 18:00:34.231593 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:34 crc kubenswrapper[4892]: I0217 18:00:34.236375 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-hjp6k" Feb 17 18:00:35 crc kubenswrapper[4892]: I0217 18:00:35.677380 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:35 crc kubenswrapper[4892]: I0217 18:00:35.685184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0756a5c3-ad9d-4f9c-a3ce-77763bd1182e-cert\") pod \"infra-operator-controller-manager-ff5c8777-kccmj\" (UID: \"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e\") " pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:35 crc kubenswrapper[4892]: I0217 18:00:35.820447 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:35 crc kubenswrapper[4892]: I0217 18:00:35.990033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:35 crc kubenswrapper[4892]: I0217 18:00:35.996960 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86e53c34-ab5b-49ab-a14e-13d76792b6ef-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk\" (UID: \"86e53c34-ab5b-49ab-a14e-13d76792b6ef\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.180263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" event={"ID":"5bc11699-06b5-4fd8-b8ff-6f3ecbda532f","Type":"ContainerStarted","Data":"903361699db52ac7f64d8ca8d9b7dd76dd391f9d9d76a567dc891c07e0a56323"} Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.201239 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jd4b7" podStartSLOduration=2.416642979 podStartE2EDuration="32.201218445s" podCreationTimestamp="2026-02-17 18:00:04 +0000 UTC" firstStartedPulling="2026-02-17 18:00:06.040984427 +0000 UTC m=+977.416387692" lastFinishedPulling="2026-02-17 18:00:35.825559893 +0000 UTC m=+1007.200963158" observedRunningTime="2026-02-17 18:00:36.197584918 +0000 UTC m=+1007.572988193" watchObservedRunningTime="2026-02-17 18:00:36.201218445 +0000 UTC m=+1007.576621720" Feb 17 
18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.255170 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.321733 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj"] Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.497870 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.498160 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.502840 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-metrics-certs\") pod \"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.506352 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1b49b91f-8f48-4831-ae32-4b1a9287123f-webhook-certs\") pod 
\"openstack-operator-controller-manager-b96b9dfc9-cb6tc\" (UID: \"1b49b91f-8f48-4831-ae32-4b1a9287123f\") " pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.572599 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:36 crc kubenswrapper[4892]: I0217 18:00:36.790746 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk"] Feb 17 18:00:36 crc kubenswrapper[4892]: W0217 18:00:36.826342 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e53c34_ab5b_49ab_a14e_13d76792b6ef.slice/crio-bfc5adac19366299bcb3302f2fff57446ff5eada0d8a4c64e9d72dc239a63c1c WatchSource:0}: Error finding container bfc5adac19366299bcb3302f2fff57446ff5eada0d8a4c64e9d72dc239a63c1c: Status 404 returned error can't find the container with id bfc5adac19366299bcb3302f2fff57446ff5eada0d8a4c64e9d72dc239a63c1c Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.035253 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc"] Feb 17 18:00:37 crc kubenswrapper[4892]: W0217 18:00:37.041909 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b49b91f_8f48_4831_ae32_4b1a9287123f.slice/crio-c74d57f27fe5be4e559ebe81b5c17c48e22efe6c9f5d6b78b0368ea40b5ad6b7 WatchSource:0}: Error finding container c74d57f27fe5be4e559ebe81b5c17c48e22efe6c9f5d6b78b0368ea40b5ad6b7: Status 404 returned error can't find the container with id c74d57f27fe5be4e559ebe81b5c17c48e22efe6c9f5d6b78b0368ea40b5ad6b7 Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.192301 4892 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" event={"ID":"f533d37d-5213-42af-82db-fc5e0208cb8d","Type":"ContainerStarted","Data":"e52fe45783cf2e3d9a444a4187d0b79fe0dd8b676a13de83394015491171c06f"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.193292 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.195237 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" event={"ID":"8dcfa260-ea93-42f8-a345-aec700f9e938","Type":"ContainerStarted","Data":"33f641db36d9154d51e1a9e1f57c226a9287b412e39b16a57189f94f35f944eb"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.195934 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.198190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" event={"ID":"062a2519-0d5a-4662-8c84-5b8926ba32a2","Type":"ContainerStarted","Data":"00c2fc62f93b94aa351f76624bcb4e17d25fda0e082891453cd959673010b0fd"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.198404 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.199631 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" event={"ID":"67b2947f-d96d-4697-9527-da5cbabc0552","Type":"ContainerStarted","Data":"37f115cb16ce595c7dc8626dd6b0edc02d4f8dd30a9d638dfc7bf301958d9e64"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.200385 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.208639 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" podStartSLOduration=4.444905452 podStartE2EDuration="34.208627525s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:06.061005308 +0000 UTC m=+977.436408573" lastFinishedPulling="2026-02-17 18:00:35.824727371 +0000 UTC m=+1007.200130646" observedRunningTime="2026-02-17 18:00:37.205166041 +0000 UTC m=+1008.580569296" watchObservedRunningTime="2026-02-17 18:00:37.208627525 +0000 UTC m=+1008.584030790" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.220902 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" event={"ID":"5716f235-cb99-4c41-b126-c122c572684a","Type":"ContainerStarted","Data":"fc472cb1b35820ef45093d1de74a02fc413c0123224b2d3b8263899b469b0754"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.221150 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.225601 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" event={"ID":"1b49b91f-8f48-4831-ae32-4b1a9287123f","Type":"ContainerStarted","Data":"c74d57f27fe5be4e559ebe81b5c17c48e22efe6c9f5d6b78b0368ea40b5ad6b7"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.228865 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" 
event={"ID":"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e","Type":"ContainerStarted","Data":"bdeadb4e59aa9d49b556947dd4cf9e9955903e8d945e55b2c07af481365e8f1d"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.230948 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" event={"ID":"a1264594-6ddc-417a-bea1-9bb9eedbb719","Type":"ContainerStarted","Data":"3cb7606ef5a96509253d16f6aa65a77996a20aae887d9a6201a612f3aa49e7d5"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.231675 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.234390 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" event={"ID":"86e53c34-ab5b-49ab-a14e-13d76792b6ef","Type":"ContainerStarted","Data":"bfc5adac19366299bcb3302f2fff57446ff5eada0d8a4c64e9d72dc239a63c1c"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.234780 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" podStartSLOduration=2.781364679 podStartE2EDuration="34.23475802s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.398319287 +0000 UTC m=+976.773722552" lastFinishedPulling="2026-02-17 18:00:36.851712628 +0000 UTC m=+1008.227115893" observedRunningTime="2026-02-17 18:00:37.227964547 +0000 UTC m=+1008.603367812" watchObservedRunningTime="2026-02-17 18:00:37.23475802 +0000 UTC m=+1008.610161285" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.235430 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" 
event={"ID":"6622263b-b231-458c-b9fc-19061a5d73a7","Type":"ContainerStarted","Data":"1c64e606a3c0fa9f7396236d5b3c229bd4a2ec8bc124283b52fe614561a05641"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.236131 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.238242 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" event={"ID":"9693b58d-64b4-4d18-a746-ec0a67606de5","Type":"ContainerStarted","Data":"09230207092a82fe29577adc2f7b570b650d006c69b5d7a7bdc749e25353f749"} Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.238698 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.251919 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" podStartSLOduration=3.891552191 podStartE2EDuration="34.251895132s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.482207931 +0000 UTC m=+976.857611196" lastFinishedPulling="2026-02-17 18:00:35.842550872 +0000 UTC m=+1007.217954137" observedRunningTime="2026-02-17 18:00:37.239129368 +0000 UTC m=+1008.614532633" watchObservedRunningTime="2026-02-17 18:00:37.251895132 +0000 UTC m=+1008.627298397" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.258710 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" podStartSLOduration=4.318157129 podStartE2EDuration="34.258690656s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.902040726 +0000 UTC m=+977.277443991" 
lastFinishedPulling="2026-02-17 18:00:35.842574233 +0000 UTC m=+1007.217977518" observedRunningTime="2026-02-17 18:00:37.254942795 +0000 UTC m=+1008.630346060" watchObservedRunningTime="2026-02-17 18:00:37.258690656 +0000 UTC m=+1008.634093921" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.318294 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" podStartSLOduration=3.336319521 podStartE2EDuration="34.318276265s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.874321197 +0000 UTC m=+977.249724462" lastFinishedPulling="2026-02-17 18:00:36.856277941 +0000 UTC m=+1008.231681206" observedRunningTime="2026-02-17 18:00:37.287755771 +0000 UTC m=+1008.663159036" watchObservedRunningTime="2026-02-17 18:00:37.318276265 +0000 UTC m=+1008.693679530" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.320678 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" podStartSLOduration=4.597369378 podStartE2EDuration="34.320666989s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:06.092998302 +0000 UTC m=+977.468401567" lastFinishedPulling="2026-02-17 18:00:35.816295923 +0000 UTC m=+1007.191699178" observedRunningTime="2026-02-17 18:00:37.309994301 +0000 UTC m=+1008.685397576" watchObservedRunningTime="2026-02-17 18:00:37.320666989 +0000 UTC m=+1008.696070254" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.335648 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" podStartSLOduration=3.452828286 podStartE2EDuration="33.335629143s" podCreationTimestamp="2026-02-17 18:00:04 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.960251198 +0000 UTC m=+977.335654463" 
lastFinishedPulling="2026-02-17 18:00:35.843052055 +0000 UTC m=+1007.218455320" observedRunningTime="2026-02-17 18:00:37.323424894 +0000 UTC m=+1008.698828159" watchObservedRunningTime="2026-02-17 18:00:37.335629143 +0000 UTC m=+1008.711032408" Feb 17 18:00:37 crc kubenswrapper[4892]: I0217 18:00:37.354579 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" podStartSLOduration=4.606752142 podStartE2EDuration="34.354561885s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:06.013080634 +0000 UTC m=+977.388483899" lastFinishedPulling="2026-02-17 18:00:35.760890377 +0000 UTC m=+1007.136293642" observedRunningTime="2026-02-17 18:00:37.345763767 +0000 UTC m=+1008.721167032" watchObservedRunningTime="2026-02-17 18:00:37.354561885 +0000 UTC m=+1008.729965150" Feb 17 18:00:38 crc kubenswrapper[4892]: I0217 18:00:38.246742 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" event={"ID":"1b49b91f-8f48-4831-ae32-4b1a9287123f","Type":"ContainerStarted","Data":"2fd420ca8db3fecaa2daace7d9d7b3b513127a525a811047d3975f8e9ea5bedb"} Feb 17 18:00:38 crc kubenswrapper[4892]: I0217 18:00:38.247268 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:38 crc kubenswrapper[4892]: I0217 18:00:38.251610 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" event={"ID":"727885b3-46ff-43a8-9991-28567f43d07e","Type":"ContainerStarted","Data":"85e37ff4f2affcc861cbf7cf14cafd7b9c245916b3caab33ae7d243ebd488cf8"} Feb 17 18:00:38 crc kubenswrapper[4892]: I0217 18:00:38.272881 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" podStartSLOduration=34.272861528 podStartE2EDuration="34.272861528s" podCreationTimestamp="2026-02-17 18:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:00:38.269516156 +0000 UTC m=+1009.644919441" watchObservedRunningTime="2026-02-17 18:00:38.272861528 +0000 UTC m=+1009.648264803" Feb 17 18:00:38 crc kubenswrapper[4892]: I0217 18:00:38.295645 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" podStartSLOduration=3.205864279 podStartE2EDuration="35.295627672s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:04.944912985 +0000 UTC m=+976.320316250" lastFinishedPulling="2026-02-17 18:00:37.034676378 +0000 UTC m=+1008.410079643" observedRunningTime="2026-02-17 18:00:38.290695669 +0000 UTC m=+1009.666098934" watchObservedRunningTime="2026-02-17 18:00:38.295627672 +0000 UTC m=+1009.671030937" Feb 17 18:00:41 crc kubenswrapper[4892]: I0217 18:00:41.286605 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" event={"ID":"2f32911d-efa2-45de-8a11-497adcb7d2bd","Type":"ContainerStarted","Data":"79bdd0d31572c2193273d7324396f77adc151184a5320e2e7e571b9a2725106f"} Feb 17 18:00:41 crc kubenswrapper[4892]: I0217 18:00:41.287244 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:41 crc kubenswrapper[4892]: I0217 18:00:41.321595 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" podStartSLOduration=4.816402531 podStartE2EDuration="38.321569178s" 
podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.90218333 +0000 UTC m=+977.277586595" lastFinishedPulling="2026-02-17 18:00:39.407349947 +0000 UTC m=+1010.782753242" observedRunningTime="2026-02-17 18:00:41.311567048 +0000 UTC m=+1012.686970403" watchObservedRunningTime="2026-02-17 18:00:41.321569178 +0000 UTC m=+1012.696972473" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.296414 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" event={"ID":"cdcfbb9d-667b-4333-bb72-96bbf99ed979","Type":"ContainerStarted","Data":"a03a78e0296344a65097c147742d8b9486cfd23b8130d071e7c8561f1a9b1bca"} Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.296972 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.298345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" event={"ID":"86e53c34-ab5b-49ab-a14e-13d76792b6ef","Type":"ContainerStarted","Data":"55da7e97c1742827872e087693dbdc8442b1eb41f95384e1e4a0d882dc7d663e"} Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.298474 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.300252 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" event={"ID":"67ccc01c-23ce-407b-91dd-9554c49acbd5","Type":"ContainerStarted","Data":"37feb22caad006be9f46d877ae4484912e01926b39d64f4e3cf3b2bb7d7ea927"} Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.300399 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.301893 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" event={"ID":"0756a5c3-ad9d-4f9c-a3ce-77763bd1182e","Type":"ContainerStarted","Data":"b9cc855cf1f6c1eb40c65da3c043b750ef7309f9a2522394bd6df894982ddb2a"} Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.302054 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.325020 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" podStartSLOduration=3.131389787 podStartE2EDuration="39.324994459s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.48996422 +0000 UTC m=+976.865367485" lastFinishedPulling="2026-02-17 18:00:41.683568852 +0000 UTC m=+1013.058972157" observedRunningTime="2026-02-17 18:00:42.322921403 +0000 UTC m=+1013.698324668" watchObservedRunningTime="2026-02-17 18:00:42.324994459 +0000 UTC m=+1013.700397734" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.342272 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" podStartSLOduration=33.979934919 podStartE2EDuration="39.342258275s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:36.332693795 +0000 UTC m=+1007.708097060" lastFinishedPulling="2026-02-17 18:00:41.695017131 +0000 UTC m=+1013.070420416" observedRunningTime="2026-02-17 18:00:42.340987212 +0000 UTC m=+1013.716390477" watchObservedRunningTime="2026-02-17 18:00:42.342258275 +0000 UTC m=+1013.717661540" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 
18:00:42.404083 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" podStartSLOduration=34.57105685 podStartE2EDuration="39.404057714s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:36.850391773 +0000 UTC m=+1008.225795028" lastFinishedPulling="2026-02-17 18:00:41.683392617 +0000 UTC m=+1013.058795892" observedRunningTime="2026-02-17 18:00:42.388141845 +0000 UTC m=+1013.763545120" watchObservedRunningTime="2026-02-17 18:00:42.404057714 +0000 UTC m=+1013.779460979" Feb 17 18:00:42 crc kubenswrapper[4892]: I0217 18:00:42.416184 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" podStartSLOduration=3.112604071 podStartE2EDuration="39.416167141s" podCreationTimestamp="2026-02-17 18:00:03 +0000 UTC" firstStartedPulling="2026-02-17 18:00:05.379985621 +0000 UTC m=+976.755388886" lastFinishedPulling="2026-02-17 18:00:41.683548681 +0000 UTC m=+1013.058951956" observedRunningTime="2026-02-17 18:00:42.410771066 +0000 UTC m=+1013.786174331" watchObservedRunningTime="2026-02-17 18:00:42.416167141 +0000 UTC m=+1013.791570396" Feb 17 18:00:43 crc kubenswrapper[4892]: I0217 18:00:43.809238 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:43 crc kubenswrapper[4892]: I0217 18:00:43.814594 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-254f9" Feb 17 18:00:43 crc kubenswrapper[4892]: I0217 18:00:43.878190 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-ml9zk" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.066003 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wdsjq" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.148280 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-67zr2" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.172996 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-m5npp" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.431271 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lf9pj" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.468000 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ktptw" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.808021 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-cpl2x" Feb 17 18:00:44 crc kubenswrapper[4892]: I0217 18:00:44.871786 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xftdg" Feb 17 18:00:46 crc kubenswrapper[4892]: I0217 18:00:46.584395 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-b96b9dfc9-cb6tc" Feb 17 18:00:53 crc kubenswrapper[4892]: I0217 18:00:53.908375 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 18:00:54 crc kubenswrapper[4892]: I0217 18:00:54.069629 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tw7fv" Feb 17 18:00:54 crc kubenswrapper[4892]: I0217 18:00:54.813252 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-tzf7x" Feb 17 18:00:55 crc kubenswrapper[4892]: I0217 18:00:55.831106 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-ff5c8777-kccmj" Feb 17 18:00:56 crc kubenswrapper[4892]: I0217 18:00:56.267356 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.419765 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ftxpz"] Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.433870 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.436635 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.436953 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.437494 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jpt6c" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.438056 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.442774 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ftxpz"] Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.495233 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8nh25"] Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.497229 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.500690 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.506919 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8nh25"] Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.623305 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvrn\" (UniqueName: \"kubernetes.io/projected/1b92aa73-c227-40c3-bd98-0c07f7fd342b-kube-api-access-blvrn\") pod \"dnsmasq-dns-675f4bcbfc-ftxpz\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.623716 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.623788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtp9\" (UniqueName: \"kubernetes.io/projected/c7e8b752-5bee-44d7-b852-e433eee3b7a8-kube-api-access-zrtp9\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.623963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-config\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.623996 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b92aa73-c227-40c3-bd98-0c07f7fd342b-config\") pod \"dnsmasq-dns-675f4bcbfc-ftxpz\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.725578 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.725925 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtp9\" (UniqueName: \"kubernetes.io/projected/c7e8b752-5bee-44d7-b852-e433eee3b7a8-kube-api-access-zrtp9\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.726069 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-config\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.726164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b92aa73-c227-40c3-bd98-0c07f7fd342b-config\") pod \"dnsmasq-dns-675f4bcbfc-ftxpz\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: 
I0217 18:01:12.726285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blvrn\" (UniqueName: \"kubernetes.io/projected/1b92aa73-c227-40c3-bd98-0c07f7fd342b-kube-api-access-blvrn\") pod \"dnsmasq-dns-675f4bcbfc-ftxpz\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.726442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.727623 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-config\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.727928 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b92aa73-c227-40c3-bd98-0c07f7fd342b-config\") pod \"dnsmasq-dns-675f4bcbfc-ftxpz\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.743624 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvrn\" (UniqueName: \"kubernetes.io/projected/1b92aa73-c227-40c3-bd98-0c07f7fd342b-kube-api-access-blvrn\") pod \"dnsmasq-dns-675f4bcbfc-ftxpz\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.752600 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zrtp9\" (UniqueName: \"kubernetes.io/projected/c7e8b752-5bee-44d7-b852-e433eee3b7a8-kube-api-access-zrtp9\") pod \"dnsmasq-dns-78dd6ddcc-8nh25\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.766581 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:12 crc kubenswrapper[4892]: I0217 18:01:12.818259 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:13 crc kubenswrapper[4892]: I0217 18:01:13.268787 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ftxpz"] Feb 17 18:01:13 crc kubenswrapper[4892]: W0217 18:01:13.377433 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e8b752_5bee_44d7_b852_e433eee3b7a8.slice/crio-3372c23036fa892a85478df71183ef199766f36aef8e97d893a12ef17b9a2c53 WatchSource:0}: Error finding container 3372c23036fa892a85478df71183ef199766f36aef8e97d893a12ef17b9a2c53: Status 404 returned error can't find the container with id 3372c23036fa892a85478df71183ef199766f36aef8e97d893a12ef17b9a2c53 Feb 17 18:01:13 crc kubenswrapper[4892]: I0217 18:01:13.380857 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8nh25"] Feb 17 18:01:13 crc kubenswrapper[4892]: I0217 18:01:13.866653 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" event={"ID":"c7e8b752-5bee-44d7-b852-e433eee3b7a8","Type":"ContainerStarted","Data":"3372c23036fa892a85478df71183ef199766f36aef8e97d893a12ef17b9a2c53"} Feb 17 18:01:13 crc kubenswrapper[4892]: I0217 18:01:13.867990 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" 
event={"ID":"1b92aa73-c227-40c3-bd98-0c07f7fd342b","Type":"ContainerStarted","Data":"53bc294c0a0156b04573fba23d6096ddba24cb66964f6c1a0563169d70e3d6b8"} Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.027521 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ftxpz"] Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.054037 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-dpswn"] Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.055565 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.068670 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-dpswn"] Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.168047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.169169 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4g8l\" (UniqueName: \"kubernetes.io/projected/c165766b-b53f-4345-802e-7262eb64618c-kube-api-access-f4g8l\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.171031 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-config\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.273206 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.273242 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4g8l\" (UniqueName: \"kubernetes.io/projected/c165766b-b53f-4345-802e-7262eb64618c-kube-api-access-f4g8l\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.273277 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-config\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.274163 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-config\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.274264 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.305370 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4g8l\" (UniqueName: \"kubernetes.io/projected/c165766b-b53f-4345-802e-7262eb64618c-kube-api-access-f4g8l\") pod \"dnsmasq-dns-5ccc8479f9-dpswn\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.378753 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.571898 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8nh25"] Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.593677 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zbw9m"] Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.595103 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.624461 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zbw9m"] Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.783366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.783416 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-config\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc 
kubenswrapper[4892]: I0217 18:01:14.783507 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtvh6\" (UniqueName: \"kubernetes.io/projected/2ca917ea-9347-4218-ab50-2156103f610a-kube-api-access-jtvh6\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.886135 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtvh6\" (UniqueName: \"kubernetes.io/projected/2ca917ea-9347-4218-ab50-2156103f610a-kube-api-access-jtvh6\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.887174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.887202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-config\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.888038 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-config\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.890480 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.905943 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtvh6\" (UniqueName: \"kubernetes.io/projected/2ca917ea-9347-4218-ab50-2156103f610a-kube-api-access-jtvh6\") pod \"dnsmasq-dns-57d769cc4f-zbw9m\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.926425 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:14 crc kubenswrapper[4892]: I0217 18:01:14.943260 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-dpswn"] Feb 17 18:01:14 crc kubenswrapper[4892]: W0217 18:01:14.956845 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc165766b_b53f_4345_802e_7262eb64618c.slice/crio-40ebec5f88ef82ec19e43255cbf4714d60f7dbb143808fa9e34b60a9ccb9be26 WatchSource:0}: Error finding container 40ebec5f88ef82ec19e43255cbf4714d60f7dbb143808fa9e34b60a9ccb9be26: Status 404 returned error can't find the container with id 40ebec5f88ef82ec19e43255cbf4714d60f7dbb143808fa9e34b60a9ccb9be26 Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.231715 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.236274 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.241571 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.241838 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.242085 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.242240 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.242392 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.242587 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xdjgd" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.243770 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.262043 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.308902 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.308949 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/60523b2e-a498-4bc9-920b-32f117afb898-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.308968 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.308985 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309006 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309215 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309328 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4skhm\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-kube-api-access-4skhm\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309396 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.309481 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/60523b2e-a498-4bc9-920b-32f117afb898-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.406822 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zbw9m"] Feb 17 18:01:15 crc 
kubenswrapper[4892]: I0217 18:01:15.411062 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/60523b2e-a498-4bc9-920b-32f117afb898-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411169 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/60523b2e-a498-4bc9-920b-32f117afb898-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411185 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411284 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411304 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411347 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411391 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4skhm\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-kube-api-access-4skhm\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.411427 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.412215 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.412519 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.413218 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.414280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.414497 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.415832 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.418521 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.419493 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/60523b2e-a498-4bc9-920b-32f117afb898-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.421827 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/60523b2e-a498-4bc9-920b-32f117afb898-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.430709 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4skhm\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-kube-api-access-4skhm\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 
18:01:15.432410 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.434938 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.575541 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.744107 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.745932 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752114 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752123 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752209 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752241 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752246 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752306 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.752426 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d6d75" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.753426 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824274 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824343 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824419 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824448 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a991e29-288f-453d-9bb4-f8d90a2689ad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824470 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824494 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkdw2\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-kube-api-access-rkdw2\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824556 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a991e29-288f-453d-9bb4-f8d90a2689ad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824572 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.824723 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.901663 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" event={"ID":"c165766b-b53f-4345-802e-7262eb64618c","Type":"ContainerStarted","Data":"40ebec5f88ef82ec19e43255cbf4714d60f7dbb143808fa9e34b60a9ccb9be26"} Feb 17 
18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.923139 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" event={"ID":"2ca917ea-9347-4218-ab50-2156103f610a","Type":"ContainerStarted","Data":"cd3b30dae02ab6718950ec073827f51303bec77bd680ef79e988052686b04a47"} Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926472 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926624 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a991e29-288f-453d-9bb4-f8d90a2689ad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926717 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkdw2\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-kube-api-access-rkdw2\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926805 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a991e29-288f-453d-9bb4-f8d90a2689ad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926847 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.926896 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.927941 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.927944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.928233 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.927892 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.927285 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.928328 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.932472 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a991e29-288f-453d-9bb4-f8d90a2689ad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.935096 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.936541 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a991e29-288f-453d-9bb4-f8d90a2689ad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.936668 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.947297 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " 
pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.947555 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkdw2\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-kube-api-access-rkdw2\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:15 crc kubenswrapper[4892]: I0217 18:01:15.976317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " pod="openstack/rabbitmq-server-0" Feb 17 18:01:16 crc kubenswrapper[4892]: I0217 18:01:16.063497 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 18:01:16 crc kubenswrapper[4892]: I0217 18:01:16.107379 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 18:01:16 crc kubenswrapper[4892]: I0217 18:01:16.641464 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:01:16 crc kubenswrapper[4892]: I0217 18:01:16.944175 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a991e29-288f-453d-9bb4-f8d90a2689ad","Type":"ContainerStarted","Data":"db247dcc000538ded4cfecb83a5e39b97feb0a16238ba40aaed9354e33407022"} Feb 17 18:01:16 crc kubenswrapper[4892]: I0217 18:01:16.994523 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"60523b2e-a498-4bc9-920b-32f117afb898","Type":"ContainerStarted","Data":"33fb36cb1c79714be309b862878f90ca69249e16aad9f6cc0881a87adaf05284"} Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.265424 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 
18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.267725 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.274339 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.275152 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wcn6r" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.277153 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.277636 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.278123 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.279730 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.380100 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.380936 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 
18:01:17.381561 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.381684 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxx58\" (UniqueName: \"kubernetes.io/projected/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kube-api-access-sxx58\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.381714 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.383320 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.383369 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.383448 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485282 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485321 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485347 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485397 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxx58\" (UniqueName: \"kubernetes.io/projected/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kube-api-access-sxx58\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485416 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485498 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485514 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485532 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.485836 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.486895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.487377 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.487955 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.488802 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.490417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.509795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.513381 4892 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.521115 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxx58\" (UniqueName: \"kubernetes.io/projected/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kube-api-access-sxx58\") pod \"openstack-galera-0\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " pod="openstack/openstack-galera-0" Feb 17 18:01:17 crc kubenswrapper[4892]: I0217 18:01:17.607175 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: W0217 18:01:18.178865 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f63d2ae_8195_4841_a7b5_f38667fc87b2.slice/crio-b37fd06238b0ce86ef52cf7e56224d81aca25c6bab4974b465b44f404c73fc50 WatchSource:0}: Error finding container b37fd06238b0ce86ef52cf7e56224d81aca25c6bab4974b465b44f404c73fc50: Status 404 returned error can't find the container with id b37fd06238b0ce86ef52cf7e56224d81aca25c6bab4974b465b44f404c73fc50 Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.184519 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.710634 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.712113 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.715409 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.715647 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gnxkd" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.715792 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.716060 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.725746 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.831695 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.831772 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwrf\" (UniqueName: \"kubernetes.io/projected/38862686-bfab-4f7d-8367-ec59a68b0299-kube-api-access-fpwrf\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.831867 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.831912 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.832005 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.832043 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.832070 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.832095 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.930686 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933579 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933640 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933678 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933783 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.933822 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpwrf\" (UniqueName: \"kubernetes.io/projected/38862686-bfab-4f7d-8367-ec59a68b0299-kube-api-access-fpwrf\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.934401 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.935568 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.943721 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.944278 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.944462 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.944651 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zsjnz" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.945416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.946055 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.948441 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.952023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.952085 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.953615 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.954445 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwrf\" (UniqueName: \"kubernetes.io/projected/38862686-bfab-4f7d-8367-ec59a68b0299-kube-api-access-fpwrf\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:18 crc kubenswrapper[4892]: I0217 18:01:18.977479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.035561 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-config-data\") pod 
\"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.035607 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-kolla-config\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.035625 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.035652 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47gz\" (UniqueName: \"kubernetes.io/projected/dadb10bf-ed88-454e-8873-9c49f762ef6e-kube-api-access-g47gz\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.035700 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.079986 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f63d2ae-8195-4841-a7b5-f38667fc87b2","Type":"ContainerStarted","Data":"b37fd06238b0ce86ef52cf7e56224d81aca25c6bab4974b465b44f404c73fc50"} Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.082329 4892 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.137201 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-config-data\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.137256 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-kolla-config\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.137286 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.137324 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47gz\" (UniqueName: \"kubernetes.io/projected/dadb10bf-ed88-454e-8873-9c49f762ef6e-kube-api-access-g47gz\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.137368 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.138181 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-config-data\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.138698 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-kolla-config\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.146517 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.146974 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.154118 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47gz\" (UniqueName: \"kubernetes.io/projected/dadb10bf-ed88-454e-8873-9c49f762ef6e-kube-api-access-g47gz\") pod \"memcached-0\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " pod="openstack/memcached-0" Feb 17 18:01:19 crc kubenswrapper[4892]: I0217 18:01:19.324638 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.016372 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.022554 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.024881 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zlbcv" Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.028884 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.180337 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsskp\" (UniqueName: \"kubernetes.io/projected/56066e3d-ab6b-4a76-bf31-11f8442d9285-kube-api-access-jsskp\") pod \"kube-state-metrics-0\" (UID: \"56066e3d-ab6b-4a76-bf31-11f8442d9285\") " pod="openstack/kube-state-metrics-0" Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.281911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsskp\" (UniqueName: \"kubernetes.io/projected/56066e3d-ab6b-4a76-bf31-11f8442d9285-kube-api-access-jsskp\") pod \"kube-state-metrics-0\" (UID: \"56066e3d-ab6b-4a76-bf31-11f8442d9285\") " pod="openstack/kube-state-metrics-0" Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.298496 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsskp\" (UniqueName: \"kubernetes.io/projected/56066e3d-ab6b-4a76-bf31-11f8442d9285-kube-api-access-jsskp\") pod \"kube-state-metrics-0\" (UID: \"56066e3d-ab6b-4a76-bf31-11f8442d9285\") " pod="openstack/kube-state-metrics-0" Feb 17 18:01:21 crc kubenswrapper[4892]: I0217 18:01:21.353055 4892 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.833963 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bz7v2"] Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.835289 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.838914 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.839465 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-d4sc7" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.840563 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.847050 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bz7v2"] Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.917646 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8n9s7"] Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.919525 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.943147 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8n9s7"] Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.961843 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-ovn-controller-tls-certs\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.962016 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4w8\" (UniqueName: \"kubernetes.io/projected/ff799349-84ed-44f7-8f46-d11d5637abf1-kube-api-access-rf4w8\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.962092 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-combined-ca-bundle\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.962240 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run-ovn\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.962314 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/ff799349-84ed-44f7-8f46-d11d5637abf1-scripts\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.962361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:24 crc kubenswrapper[4892]: I0217 18:01:24.962434 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-log-ovn\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064341 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-lib\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064451 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-etc-ovs\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064485 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf4w8\" (UniqueName: 
\"kubernetes.io/projected/ff799349-84ed-44f7-8f46-d11d5637abf1-kube-api-access-rf4w8\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064503 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-scripts\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064521 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-log\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064566 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-combined-ca-bundle\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064597 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run-ovn\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064621 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff799349-84ed-44f7-8f46-d11d5637abf1-scripts\") pod \"ovn-controller-bz7v2\" (UID: 
\"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064661 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-run\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064685 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-log-ovn\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2g9\" (UniqueName: \"kubernetes.io/projected/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-kube-api-access-mh2g9\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.064734 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-ovn-controller-tls-certs\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" 
Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.065380 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.065510 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-log-ovn\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.066027 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run-ovn\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.066950 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff799349-84ed-44f7-8f46-d11d5637abf1-scripts\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.076783 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-ovn-controller-tls-certs\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.076907 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-combined-ca-bundle\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.081670 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf4w8\" (UniqueName: \"kubernetes.io/projected/ff799349-84ed-44f7-8f46-d11d5637abf1-kube-api-access-rf4w8\") pod \"ovn-controller-bz7v2\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166492 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2g9\" (UniqueName: \"kubernetes.io/projected/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-kube-api-access-mh2g9\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166574 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-lib\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-etc-ovs\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166626 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-scripts\") pod \"ovn-controller-ovs-8n9s7\" (UID: 
\"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166642 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-log\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-run\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.166979 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-run\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.167147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-etc-ovs\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.167245 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-log\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.167279 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-lib\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.176200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-scripts\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.183749 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2g9\" (UniqueName: \"kubernetes.io/projected/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-kube-api-access-mh2g9\") pod \"ovn-controller-ovs-8n9s7\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.197714 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:25 crc kubenswrapper[4892]: I0217 18:01:25.239057 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.530415 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.532290 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.536498 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q8xgc" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.536832 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.536960 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.537249 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.537408 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.543507 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.691785 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.691845 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgh7\" (UniqueName: \"kubernetes.io/projected/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-kube-api-access-fdgh7\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.691878 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.691914 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.691954 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.691988 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.692005 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-config\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.692039 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.794854 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.794887 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgh7\" (UniqueName: \"kubernetes.io/projected/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-kube-api-access-fdgh7\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.794906 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.794935 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.794965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " 
pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.794987 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.795002 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-config\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.795028 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.795324 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.796631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.799105 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.799668 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.799830 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-config\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.803665 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.815710 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.817642 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgh7\" (UniqueName: \"kubernetes.io/projected/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-kube-api-access-fdgh7\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " 
pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.839587 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:26 crc kubenswrapper[4892]: I0217 18:01:26.856551 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.467792 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.469736 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.511572 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.512184 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-56flm" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.512423 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.515096 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.519069 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.620458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.620545 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.620710 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.620784 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.621507 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.624508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " 
pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.624584 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvv4x\" (UniqueName: \"kubernetes.io/projected/c085ee96-4617-4fa6-b546-a68d29c6238b-kube-api-access-wvv4x\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.624633 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726425 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726521 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726660 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726703 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wvv4x\" (UniqueName: \"kubernetes.io/projected/c085ee96-4617-4fa6-b546-a68d29c6238b-kube-api-access-wvv4x\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726744 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726798 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.726952 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.727733 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.728804 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.729256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.730312 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.732786 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.734456 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.740707 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.752839 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.756788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvv4x\" (UniqueName: \"kubernetes.io/projected/c085ee96-4617-4fa6-b546-a68d29c6238b-kube-api-access-wvv4x\") pod \"ovsdbserver-nb-0\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:27 crc kubenswrapper[4892]: I0217 18:01:27.833693 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.303246 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.304004 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-blvrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ftxpz_openstack(1b92aa73-c227-40c3-bd98-0c07f7fd342b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.312216 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" podUID="1b92aa73-c227-40c3-bd98-0c07f7fd342b" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.357560 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.357933 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtvh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-zbw9m_openstack(2ca917ea-9347-4218-ab50-2156103f610a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.359121 4892 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" podUID="2ca917ea-9347-4218-ab50-2156103f610a" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.361651 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.361862 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrtp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-8nh25_openstack(c7e8b752-5bee-44d7-b852-e433eee3b7a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.363586 4892 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" podUID="c7e8b752-5bee-44d7-b852-e433eee3b7a8" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.390035 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.390176 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4g8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-dpswn_openstack(c165766b-b53f-4345-802e-7262eb64618c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:01:37 crc kubenswrapper[4892]: E0217 18:01:37.391941 4892 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" podUID="c165766b-b53f-4345-802e-7262eb64618c" Feb 17 18:01:37 crc kubenswrapper[4892]: I0217 18:01:37.814935 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 18:01:37 crc kubenswrapper[4892]: W0217 18:01:37.816795 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38862686_bfab_4f7d_8367_ec59a68b0299.slice/crio-5135b40394a291608961e1f25d10b174759bc6fbb292af7050e6cb4fcb627e0f WatchSource:0}: Error finding container 5135b40394a291608961e1f25d10b174759bc6fbb292af7050e6cb4fcb627e0f: Status 404 returned error can't find the container with id 5135b40394a291608961e1f25d10b174759bc6fbb292af7050e6cb4fcb627e0f Feb 17 18:01:37 crc kubenswrapper[4892]: W0217 18:01:37.819009 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56066e3d_ab6b_4a76_bf31_11f8442d9285.slice/crio-b95e17644cc3c59ab1e620d0e3fd9b8bf013435ef5feae7623363500958d3efb WatchSource:0}: Error finding container b95e17644cc3c59ab1e620d0e3fd9b8bf013435ef5feae7623363500958d3efb: Status 404 returned error can't find the container with id b95e17644cc3c59ab1e620d0e3fd9b8bf013435ef5feae7623363500958d3efb Feb 17 18:01:37 crc kubenswrapper[4892]: I0217 18:01:37.825024 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 18:01:37 crc kubenswrapper[4892]: I0217 18:01:37.832732 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.037567 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bz7v2"] Feb 17 18:01:38 crc kubenswrapper[4892]: W0217 
18:01:38.039586 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff799349_84ed_44f7_8f46_d11d5637abf1.slice/crio-bab04d84fb9fff3cd8342885cce22150d7b9a68faff5371f3ce3ae03d1da9f28 WatchSource:0}: Error finding container bab04d84fb9fff3cd8342885cce22150d7b9a68faff5371f3ce3ae03d1da9f28: Status 404 returned error can't find the container with id bab04d84fb9fff3cd8342885cce22150d7b9a68faff5371f3ce3ae03d1da9f28 Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.309227 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56066e3d-ab6b-4a76-bf31-11f8442d9285","Type":"ContainerStarted","Data":"b95e17644cc3c59ab1e620d0e3fd9b8bf013435ef5feae7623363500958d3efb"} Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.310895 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2" event={"ID":"ff799349-84ed-44f7-8f46-d11d5637abf1","Type":"ContainerStarted","Data":"bab04d84fb9fff3cd8342885cce22150d7b9a68faff5371f3ce3ae03d1da9f28"} Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.316799 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38862686-bfab-4f7d-8367-ec59a68b0299","Type":"ContainerStarted","Data":"a6f29bfcf8f5047a2ec9026b3812106d7f946e8416e1e920e156552a30f70a3e"} Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.316891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38862686-bfab-4f7d-8367-ec59a68b0299","Type":"ContainerStarted","Data":"5135b40394a291608961e1f25d10b174759bc6fbb292af7050e6cb4fcb627e0f"} Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.321461 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"4f63d2ae-8195-4841-a7b5-f38667fc87b2","Type":"ContainerStarted","Data":"7ace672204f35f9e3ce9138c9274e92b32a9ad967b4d642a8ab107066e7260aa"} Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.324002 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"dadb10bf-ed88-454e-8873-9c49f762ef6e","Type":"ContainerStarted","Data":"dbbbecb0ba3fdbb6d53c711ee669f5b906795c9e893eb7463eb720302edeef15"} Feb 17 18:01:38 crc kubenswrapper[4892]: E0217 18:01:38.326853 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" podUID="2ca917ea-9347-4218-ab50-2156103f610a" Feb 17 18:01:38 crc kubenswrapper[4892]: E0217 18:01:38.327739 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" podUID="c165766b-b53f-4345-802e-7262eb64618c" Feb 17 18:01:38 crc kubenswrapper[4892]: I0217 18:01:38.470242 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.075642 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.081202 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.173770 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blvrn\" (UniqueName: \"kubernetes.io/projected/1b92aa73-c227-40c3-bd98-0c07f7fd342b-kube-api-access-blvrn\") pod \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.173885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b92aa73-c227-40c3-bd98-0c07f7fd342b-config\") pod \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\" (UID: \"1b92aa73-c227-40c3-bd98-0c07f7fd342b\") " Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.174000 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-config\") pod \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.174136 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-dns-svc\") pod \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.174231 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtp9\" (UniqueName: \"kubernetes.io/projected/c7e8b752-5bee-44d7-b852-e433eee3b7a8-kube-api-access-zrtp9\") pod \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\" (UID: \"c7e8b752-5bee-44d7-b852-e433eee3b7a8\") " Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.174991 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7e8b752-5bee-44d7-b852-e433eee3b7a8" (UID: "c7e8b752-5bee-44d7-b852-e433eee3b7a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.175147 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-config" (OuterVolumeSpecName: "config") pod "c7e8b752-5bee-44d7-b852-e433eee3b7a8" (UID: "c7e8b752-5bee-44d7-b852-e433eee3b7a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.176013 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b92aa73-c227-40c3-bd98-0c07f7fd342b-config" (OuterVolumeSpecName: "config") pod "1b92aa73-c227-40c3-bd98-0c07f7fd342b" (UID: "1b92aa73-c227-40c3-bd98-0c07f7fd342b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.186048 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b92aa73-c227-40c3-bd98-0c07f7fd342b-kube-api-access-blvrn" (OuterVolumeSpecName: "kube-api-access-blvrn") pod "1b92aa73-c227-40c3-bd98-0c07f7fd342b" (UID: "1b92aa73-c227-40c3-bd98-0c07f7fd342b"). InnerVolumeSpecName "kube-api-access-blvrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.186194 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e8b752-5bee-44d7-b852-e433eee3b7a8-kube-api-access-zrtp9" (OuterVolumeSpecName: "kube-api-access-zrtp9") pod "c7e8b752-5bee-44d7-b852-e433eee3b7a8" (UID: "c7e8b752-5bee-44d7-b852-e433eee3b7a8"). InnerVolumeSpecName "kube-api-access-zrtp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.277479 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.277511 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtp9\" (UniqueName: \"kubernetes.io/projected/c7e8b752-5bee-44d7-b852-e433eee3b7a8-kube-api-access-zrtp9\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.277523 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blvrn\" (UniqueName: \"kubernetes.io/projected/1b92aa73-c227-40c3-bd98-0c07f7fd342b-kube-api-access-blvrn\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.277531 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b92aa73-c227-40c3-bd98-0c07f7fd342b-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.277541 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e8b752-5bee-44d7-b852-e433eee3b7a8-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.335035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"747e7c96-8d95-4c34-9ff3-83dc8c793fc2","Type":"ContainerStarted","Data":"75439c82587547eef8780e63384b061d636764c68d1a719e24b0c40e7e911241"} Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.336165 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" event={"ID":"c7e8b752-5bee-44d7-b852-e433eee3b7a8","Type":"ContainerDied","Data":"3372c23036fa892a85478df71183ef199766f36aef8e97d893a12ef17b9a2c53"} Feb 17 
18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.336189 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8nh25" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.337556 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.337574 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ftxpz" event={"ID":"1b92aa73-c227-40c3-bd98-0c07f7fd342b","Type":"ContainerDied","Data":"53bc294c0a0156b04573fba23d6096ddba24cb66964f6c1a0563169d70e3d6b8"} Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.339041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a991e29-288f-453d-9bb4-f8d90a2689ad","Type":"ContainerStarted","Data":"503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f"} Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.342936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"60523b2e-a498-4bc9-920b-32f117afb898","Type":"ContainerStarted","Data":"e99502375ab69e8b92fee8d394383cb65bc6d54cb14279a57cb6a3666079c978"} Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.494486 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.534480 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8nh25"] Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.545668 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8nh25"] Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 18:01:39.566340 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ftxpz"] Feb 17 18:01:39 crc kubenswrapper[4892]: I0217 
18:01:39.575781 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ftxpz"] Feb 17 18:01:40 crc kubenswrapper[4892]: I0217 18:01:40.114013 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8n9s7"] Feb 17 18:01:40 crc kubenswrapper[4892]: W0217 18:01:40.362840 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc085ee96_4617_4fa6_b546_a68d29c6238b.slice/crio-adf20f4c1fb1e37e6edc36fa477c3ec7dfa0b7c2a94d5c159631dd0053972529 WatchSource:0}: Error finding container adf20f4c1fb1e37e6edc36fa477c3ec7dfa0b7c2a94d5c159631dd0053972529: Status 404 returned error can't find the container with id adf20f4c1fb1e37e6edc36fa477c3ec7dfa0b7c2a94d5c159631dd0053972529 Feb 17 18:01:40 crc kubenswrapper[4892]: W0217 18:01:40.438805 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdceb70_55cb_4f2a_ac20_7fe8e9f4d064.slice/crio-250911d16f139c55d6260ef758663ea4bd7069431179b3f2214a1dc64aac1a5c WatchSource:0}: Error finding container 250911d16f139c55d6260ef758663ea4bd7069431179b3f2214a1dc64aac1a5c: Status 404 returned error can't find the container with id 250911d16f139c55d6260ef758663ea4bd7069431179b3f2214a1dc64aac1a5c Feb 17 18:01:41 crc kubenswrapper[4892]: I0217 18:01:41.364538 4892 generic.go:334] "Generic (PLEG): container finished" podID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerID="7ace672204f35f9e3ce9138c9274e92b32a9ad967b4d642a8ab107066e7260aa" exitCode=0 Feb 17 18:01:41 crc kubenswrapper[4892]: I0217 18:01:41.377196 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b92aa73-c227-40c3-bd98-0c07f7fd342b" path="/var/lib/kubelet/pods/1b92aa73-c227-40c3-bd98-0c07f7fd342b/volumes" Feb 17 18:01:41 crc kubenswrapper[4892]: I0217 18:01:41.377569 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c7e8b752-5bee-44d7-b852-e433eee3b7a8" path="/var/lib/kubelet/pods/c7e8b752-5bee-44d7-b852-e433eee3b7a8/volumes" Feb 17 18:01:41 crc kubenswrapper[4892]: I0217 18:01:41.377927 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerStarted","Data":"250911d16f139c55d6260ef758663ea4bd7069431179b3f2214a1dc64aac1a5c"} Feb 17 18:01:41 crc kubenswrapper[4892]: I0217 18:01:41.377953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f63d2ae-8195-4841-a7b5-f38667fc87b2","Type":"ContainerDied","Data":"7ace672204f35f9e3ce9138c9274e92b32a9ad967b4d642a8ab107066e7260aa"} Feb 17 18:01:41 crc kubenswrapper[4892]: I0217 18:01:41.377971 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c085ee96-4617-4fa6-b546-a68d29c6238b","Type":"ContainerStarted","Data":"adf20f4c1fb1e37e6edc36fa477c3ec7dfa0b7c2a94d5c159631dd0053972529"} Feb 17 18:01:42 crc kubenswrapper[4892]: I0217 18:01:42.376086 4892 generic.go:334] "Generic (PLEG): container finished" podID="38862686-bfab-4f7d-8367-ec59a68b0299" containerID="a6f29bfcf8f5047a2ec9026b3812106d7f946e8416e1e920e156552a30f70a3e" exitCode=0 Feb 17 18:01:42 crc kubenswrapper[4892]: I0217 18:01:42.376433 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38862686-bfab-4f7d-8367-ec59a68b0299","Type":"ContainerDied","Data":"a6f29bfcf8f5047a2ec9026b3812106d7f946e8416e1e920e156552a30f70a3e"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.397568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38862686-bfab-4f7d-8367-ec59a68b0299","Type":"ContainerStarted","Data":"30d52ee98141b0a4b18a9c3892ae7d9fadf3cc7b619452277145c17c67f590bb"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.399570 4892 generic.go:334] 
"Generic (PLEG): container finished" podID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerID="e5e3d7ffade91386dbb0cee4288429c6a1a13fd89deb550ad8243529008d2c7b" exitCode=0 Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.399855 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerDied","Data":"e5e3d7ffade91386dbb0cee4288429c6a1a13fd89deb550ad8243529008d2c7b"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.401940 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"747e7c96-8d95-4c34-9ff3-83dc8c793fc2","Type":"ContainerStarted","Data":"1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.404020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f63d2ae-8195-4841-a7b5-f38667fc87b2","Type":"ContainerStarted","Data":"55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.405538 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"dadb10bf-ed88-454e-8873-9c49f762ef6e","Type":"ContainerStarted","Data":"264f797728e12c3996d15cc2b9cd8446cea32fc84d31eeb1fec2bcc2395f7027"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.405742 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.410624 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56066e3d-ab6b-4a76-bf31-11f8442d9285","Type":"ContainerStarted","Data":"d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.410842 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.413208 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c085ee96-4617-4fa6-b546-a68d29c6238b","Type":"ContainerStarted","Data":"82eb422a29271ebf3f8cf145783a1075bbaffa1ac3279210e6826fd4595ca345"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.415735 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2" event={"ID":"ff799349-84ed-44f7-8f46-d11d5637abf1","Type":"ContainerStarted","Data":"2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991"} Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.416198 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-bz7v2" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.424157 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.424138389 podStartE2EDuration="27.424138389s" podCreationTimestamp="2026-02-17 18:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:01:44.419569877 +0000 UTC m=+1075.794973182" watchObservedRunningTime="2026-02-17 18:01:44.424138389 +0000 UTC m=+1075.799541654" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.439086 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bz7v2" podStartSLOduration=15.584152667 podStartE2EDuration="20.439069623s" podCreationTimestamp="2026-02-17 18:01:24 +0000 UTC" firstStartedPulling="2026-02-17 18:01:38.041646331 +0000 UTC m=+1069.417049596" lastFinishedPulling="2026-02-17 18:01:42.896563277 +0000 UTC m=+1074.271966552" observedRunningTime="2026-02-17 18:01:44.435589489 +0000 UTC m=+1075.810992754" watchObservedRunningTime="2026-02-17 18:01:44.439069623 +0000 UTC 
m=+1075.814472888" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.462733 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.560378654 podStartE2EDuration="26.462714141s" podCreationTimestamp="2026-02-17 18:01:18 +0000 UTC" firstStartedPulling="2026-02-17 18:01:37.823952813 +0000 UTC m=+1069.199356078" lastFinishedPulling="2026-02-17 18:01:42.7262883 +0000 UTC m=+1074.101691565" observedRunningTime="2026-02-17 18:01:44.453283577 +0000 UTC m=+1075.828686842" watchObservedRunningTime="2026-02-17 18:01:44.462714141 +0000 UTC m=+1075.838117406" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.477634 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.321784851 podStartE2EDuration="28.477615554s" podCreationTimestamp="2026-02-17 18:01:16 +0000 UTC" firstStartedPulling="2026-02-17 18:01:18.18992926 +0000 UTC m=+1049.565332525" lastFinishedPulling="2026-02-17 18:01:37.345759963 +0000 UTC m=+1068.721163228" observedRunningTime="2026-02-17 18:01:44.477091869 +0000 UTC m=+1075.852495134" watchObservedRunningTime="2026-02-17 18:01:44.477615554 +0000 UTC m=+1075.853018819" Feb 17 18:01:44 crc kubenswrapper[4892]: I0217 18:01:44.522372 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.660398626 podStartE2EDuration="24.522353441s" podCreationTimestamp="2026-02-17 18:01:20 +0000 UTC" firstStartedPulling="2026-02-17 18:01:37.82196953 +0000 UTC m=+1069.197372795" lastFinishedPulling="2026-02-17 18:01:43.683924345 +0000 UTC m=+1075.059327610" observedRunningTime="2026-02-17 18:01:44.520674006 +0000 UTC m=+1075.896077281" watchObservedRunningTime="2026-02-17 18:01:44.522353441 +0000 UTC m=+1075.897756706" Feb 17 18:01:45 crc kubenswrapper[4892]: I0217 18:01:45.426166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerStarted","Data":"2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a"} Feb 17 18:01:45 crc kubenswrapper[4892]: I0217 18:01:45.426524 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerStarted","Data":"e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64"} Feb 17 18:01:45 crc kubenswrapper[4892]: I0217 18:01:45.453120 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8n9s7" podStartSLOduration=19.082379205 podStartE2EDuration="21.453097021s" podCreationTimestamp="2026-02-17 18:01:24 +0000 UTC" firstStartedPulling="2026-02-17 18:01:40.457797874 +0000 UTC m=+1071.833201139" lastFinishedPulling="2026-02-17 18:01:42.82851564 +0000 UTC m=+1074.203918955" observedRunningTime="2026-02-17 18:01:45.446236045 +0000 UTC m=+1076.821639320" watchObservedRunningTime="2026-02-17 18:01:45.453097021 +0000 UTC m=+1076.828500296" Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.454515 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"747e7c96-8d95-4c34-9ff3-83dc8c793fc2","Type":"ContainerStarted","Data":"c98b346276d9129a2832ecb16437471be06a6bf5533e4bd04d38cba334c672a5"} Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.461180 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c085ee96-4617-4fa6-b546-a68d29c6238b","Type":"ContainerStarted","Data":"091908dabe7cfe9736d9853d937fdfd18cc05ba8b8dcdcc384696d866e69fa27"} Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.461376 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.461497 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.496723 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.425275247 podStartE2EDuration="21.496695006s" podCreationTimestamp="2026-02-17 18:01:25 +0000 UTC" firstStartedPulling="2026-02-17 18:01:38.475584857 +0000 UTC m=+1069.850988122" lastFinishedPulling="2026-02-17 18:01:45.547004616 +0000 UTC m=+1076.922407881" observedRunningTime="2026-02-17 18:01:46.485267698 +0000 UTC m=+1077.860670963" watchObservedRunningTime="2026-02-17 18:01:46.496695006 +0000 UTC m=+1077.872098311" Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.520188 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.353517537 podStartE2EDuration="20.52016737s" podCreationTimestamp="2026-02-17 18:01:26 +0000 UTC" firstStartedPulling="2026-02-17 18:01:40.366627562 +0000 UTC m=+1071.742030867" lastFinishedPulling="2026-02-17 18:01:45.533277435 +0000 UTC m=+1076.908680700" observedRunningTime="2026-02-17 18:01:46.51130181 +0000 UTC m=+1077.886705075" watchObservedRunningTime="2026-02-17 18:01:46.52016737 +0000 UTC m=+1077.895570635" Feb 17 18:01:46 crc kubenswrapper[4892]: I0217 18:01:46.856938 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:47 crc kubenswrapper[4892]: I0217 18:01:47.624244 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 17 18:01:47 crc kubenswrapper[4892]: I0217 18:01:47.625695 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 17 18:01:47 crc kubenswrapper[4892]: I0217 18:01:47.834111 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 17 
18:01:47 crc kubenswrapper[4892]: I0217 18:01:47.857525 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:47 crc kubenswrapper[4892]: I0217 18:01:47.895533 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:48 crc kubenswrapper[4892]: I0217 18:01:48.676172 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 17 18:01:48 crc kubenswrapper[4892]: I0217 18:01:48.871544 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:48 crc kubenswrapper[4892]: I0217 18:01:48.945282 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:48 crc kubenswrapper[4892]: I0217 18:01:48.946858 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-dpswn"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.004896 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-s6h6h"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.006663 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.012335 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.020027 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-s6h6h"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.073181 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-858jh"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.074451 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078577 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-config\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078636 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-config\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078666 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5j8x\" (UniqueName: \"kubernetes.io/projected/3d527912-c53f-420a-a4d2-a417f9b9caa1-kube-api-access-c5j8x\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48scl\" (UniqueName: \"kubernetes.io/projected/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-kube-api-access-48scl\") pod \"ovn-controller-metrics-858jh\" 
(UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078726 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovs-rundir\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078743 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078785 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovn-rundir\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078828 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-combined-ca-bundle\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.078846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.079220 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.083412 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.083433 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.084977 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-858jh"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.174301 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.179738 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5j8x\" (UniqueName: \"kubernetes.io/projected/3d527912-c53f-420a-a4d2-a417f9b9caa1-kube-api-access-c5j8x\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.179786 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48scl\" (UniqueName: \"kubernetes.io/projected/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-kube-api-access-48scl\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.179825 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovs-rundir\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.179884 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180250 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovs-rundir\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovn-rundir\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180434 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-combined-ca-bundle\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180495 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-config\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180606 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.180667 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-config\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.181619 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovn-rundir\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.182843 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-config\") pod 
\"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.184047 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-config\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.186031 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-combined-ca-bundle\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.186781 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.192195 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.194357 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " 
pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.209104 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5j8x\" (UniqueName: \"kubernetes.io/projected/3d527912-c53f-420a-a4d2-a417f9b9caa1-kube-api-access-c5j8x\") pod \"dnsmasq-dns-6bc7876d45-s6h6h\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") " pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.218800 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48scl\" (UniqueName: \"kubernetes.io/projected/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-kube-api-access-48scl\") pod \"ovn-controller-metrics-858jh\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.296692 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zbw9m"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.310054 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.330367 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-q88lk"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.332230 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.332894 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.340298 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.353255 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.403157 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.426689 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q88lk"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.490458 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-config\") pod \"c165766b-b53f-4345-802e-7262eb64618c\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.490587 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4g8l\" (UniqueName: \"kubernetes.io/projected/c165766b-b53f-4345-802e-7262eb64618c-kube-api-access-f4g8l\") pod \"c165766b-b53f-4345-802e-7262eb64618c\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.490686 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-dns-svc\") pod \"c165766b-b53f-4345-802e-7262eb64618c\" (UID: \"c165766b-b53f-4345-802e-7262eb64618c\") " Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.491022 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " 
pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.491056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.491096 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-dns-svc\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.491243 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5wfn\" (UniqueName: \"kubernetes.io/projected/059760ef-70ba-461a-8ff9-e873c3e9501a-kube-api-access-f5wfn\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.491294 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-config" (OuterVolumeSpecName: "config") pod "c165766b-b53f-4345-802e-7262eb64618c" (UID: "c165766b-b53f-4345-802e-7262eb64618c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.491321 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-config\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.500857 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c165766b-b53f-4345-802e-7262eb64618c" (UID: "c165766b-b53f-4345-802e-7262eb64618c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.501039 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c165766b-b53f-4345-802e-7262eb64618c-kube-api-access-f4g8l" (OuterVolumeSpecName: "kube-api-access-f4g8l") pod "c165766b-b53f-4345-802e-7262eb64618c" (UID: "c165766b-b53f-4345-802e-7262eb64618c"). InnerVolumeSpecName "kube-api-access-f4g8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.504438 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.624513 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5wfn\" (UniqueName: \"kubernetes.io/projected/059760ef-70ba-461a-8ff9-e873c3e9501a-kube-api-access-f5wfn\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.624590 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-config\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.624663 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.624689 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.624723 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-dns-svc\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.629726 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4g8l\" (UniqueName: \"kubernetes.io/projected/c165766b-b53f-4345-802e-7262eb64618c-kube-api-access-f4g8l\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.630479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.630966 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-config\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.631504 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-dns-svc\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.632328 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " 
pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.632352 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c165766b-b53f-4345-802e-7262eb64618c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.665286 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5wfn\" (UniqueName: \"kubernetes.io/projected/059760ef-70ba-461a-8ff9-e873c3e9501a-kube-api-access-f5wfn\") pod \"dnsmasq-dns-8554648995-q88lk\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.676404 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.713833 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.717777 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-dpswn" event={"ID":"c165766b-b53f-4345-802e-7262eb64618c","Type":"ContainerDied","Data":"40ebec5f88ef82ec19e43255cbf4714d60f7dbb143808fa9e34b60a9ccb9be26"} Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.740308 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.775960 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.793196 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-dpswn"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.803590 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-dpswn"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.836236 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtvh6\" (UniqueName: \"kubernetes.io/projected/2ca917ea-9347-4218-ab50-2156103f610a-kube-api-access-jtvh6\") pod \"2ca917ea-9347-4218-ab50-2156103f610a\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.836465 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-config\") pod \"2ca917ea-9347-4218-ab50-2156103f610a\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.836546 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-dns-svc\") pod \"2ca917ea-9347-4218-ab50-2156103f610a\" (UID: \"2ca917ea-9347-4218-ab50-2156103f610a\") " Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.837469 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-config" (OuterVolumeSpecName: "config") pod "2ca917ea-9347-4218-ab50-2156103f610a" (UID: "2ca917ea-9347-4218-ab50-2156103f610a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.838127 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ca917ea-9347-4218-ab50-2156103f610a" (UID: "2ca917ea-9347-4218-ab50-2156103f610a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.842120 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca917ea-9347-4218-ab50-2156103f610a-kube-api-access-jtvh6" (OuterVolumeSpecName: "kube-api-access-jtvh6") pod "2ca917ea-9347-4218-ab50-2156103f610a" (UID: "2ca917ea-9347-4218-ab50-2156103f610a"). InnerVolumeSpecName "kube-api-access-jtvh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.842897 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtvh6\" (UniqueName: \"kubernetes.io/projected/2ca917ea-9347-4218-ab50-2156103f610a-kube-api-access-jtvh6\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.843305 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.843316 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ca917ea-9347-4218-ab50-2156103f610a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.870307 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.924859 4892 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.936431 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.939084 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-bqkd7" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.939253 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.939371 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.939509 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 17 18:01:49 crc kubenswrapper[4892]: I0217 18:01:49.949282 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.046343 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62xg\" (UniqueName: \"kubernetes.io/projected/5b688e91-e3c2-4a0f-a784-a694f951ea5e-kube-api-access-l62xg\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.046614 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-config\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.046661 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.046680 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.046707 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-scripts\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.046948 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.047343 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.148995 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149046 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62xg\" (UniqueName: \"kubernetes.io/projected/5b688e91-e3c2-4a0f-a784-a694f951ea5e-kube-api-access-l62xg\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149122 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-config\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149497 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149541 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-scripts\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.149957 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.151956 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-scripts\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.152063 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-config\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.152711 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.152856 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.153421 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.164463 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62xg\" (UniqueName: \"kubernetes.io/projected/5b688e91-e3c2-4a0f-a784-a694f951ea5e-kube-api-access-l62xg\") pod \"ovn-northd-0\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.200064 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-858jh"] Feb 17 18:01:50 crc kubenswrapper[4892]: W0217 18:01:50.211407 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa192ddc_2dc5_4684_afc7_29c9e9db5f6b.slice/crio-ec45d12b51ac0478080a8880ebb2a56c1612762345c8796d6face1374704b5a8 WatchSource:0}: Error finding container ec45d12b51ac0478080a8880ebb2a56c1612762345c8796d6face1374704b5a8: Status 404 returned error can't find the container with id ec45d12b51ac0478080a8880ebb2a56c1612762345c8796d6face1374704b5a8 Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.265248 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.287776 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-s6h6h"] Feb 17 18:01:50 crc kubenswrapper[4892]: W0217 18:01:50.294541 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d527912_c53f_420a_a4d2_a417f9b9caa1.slice/crio-f76993c4fba0708fa54b43c81316637892b2a398f0393bf734066ea064f5db27 WatchSource:0}: Error finding container f76993c4fba0708fa54b43c81316637892b2a398f0393bf734066ea064f5db27: Status 404 returned error can't find the container with id f76993c4fba0708fa54b43c81316637892b2a398f0393bf734066ea064f5db27 Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.299798 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q88lk"] Feb 17 18:01:50 crc kubenswrapper[4892]: W0217 18:01:50.540141 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b688e91_e3c2_4a0f_a784_a694f951ea5e.slice/crio-9d795354563a389daa71a0017918bcc9fe39ccec245ee6211bd79ca5dfe2c2cd WatchSource:0}: Error finding container 9d795354563a389daa71a0017918bcc9fe39ccec245ee6211bd79ca5dfe2c2cd: Status 404 returned error can't find the container with id 9d795354563a389daa71a0017918bcc9fe39ccec245ee6211bd79ca5dfe2c2cd Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.542849 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.724541 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" event={"ID":"2ca917ea-9347-4218-ab50-2156103f610a","Type":"ContainerDied","Data":"cd3b30dae02ab6718950ec073827f51303bec77bd680ef79e988052686b04a47"} Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.724572 4892 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zbw9m" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.726047 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b688e91-e3c2-4a0f-a784-a694f951ea5e","Type":"ContainerStarted","Data":"9d795354563a389daa71a0017918bcc9fe39ccec245ee6211bd79ca5dfe2c2cd"} Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.727682 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" event={"ID":"3d527912-c53f-420a-a4d2-a417f9b9caa1","Type":"ContainerStarted","Data":"f76993c4fba0708fa54b43c81316637892b2a398f0393bf734066ea064f5db27"} Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.730414 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-858jh" event={"ID":"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b","Type":"ContainerStarted","Data":"63921be3f9a636b065a7e01b28b406aaf94dbcbae3cf8b99719f9d18b1baacbb"} Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.730457 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-858jh" event={"ID":"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b","Type":"ContainerStarted","Data":"ec45d12b51ac0478080a8880ebb2a56c1612762345c8796d6face1374704b5a8"} Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.731586 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q88lk" event={"ID":"059760ef-70ba-461a-8ff9-e873c3e9501a","Type":"ContainerStarted","Data":"65178bec6aa381d8e962e4b1c436b2bc672b61f766ca52a02231df65d6a49da7"} Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.790396 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-858jh" podStartSLOduration=1.79037402 podStartE2EDuration="1.79037402s" podCreationTimestamp="2026-02-17 18:01:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:01:50.75627955 +0000 UTC m=+1082.131682845" watchObservedRunningTime="2026-02-17 18:01:50.79037402 +0000 UTC m=+1082.165777295" Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.831945 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zbw9m"] Feb 17 18:01:50 crc kubenswrapper[4892]: I0217 18:01:50.841373 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zbw9m"] Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.393195 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca917ea-9347-4218-ab50-2156103f610a" path="/var/lib/kubelet/pods/2ca917ea-9347-4218-ab50-2156103f610a/volumes" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.393788 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c165766b-b53f-4345-802e-7262eb64618c" path="/var/lib/kubelet/pods/c165766b-b53f-4345-802e-7262eb64618c/volumes" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.394168 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.473388 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-s6h6h"] Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.525324 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hkkrz"] Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.527046 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.545799 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hkkrz"] Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.592725 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.592784 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.592921 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-config\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.592957 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.593000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4xbtj\" (UniqueName: \"kubernetes.io/projected/a40576fc-fbd3-45f5-afb8-50de90642017-kube-api-access-4xbtj\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.694787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.694888 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.695007 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-config\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.695048 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.695072 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbtj\" (UniqueName: 
\"kubernetes.io/projected/a40576fc-fbd3-45f5-afb8-50de90642017-kube-api-access-4xbtj\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.695831 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.696195 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-config\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.696309 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.697716 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.713472 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbtj\" (UniqueName: \"kubernetes.io/projected/a40576fc-fbd3-45f5-afb8-50de90642017-kube-api-access-4xbtj\") pod 
\"dnsmasq-dns-b8fbc5445-hkkrz\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.744169 4892 generic.go:334] "Generic (PLEG): container finished" podID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerID="015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb" exitCode=0 Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.744284 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q88lk" event={"ID":"059760ef-70ba-461a-8ff9-e873c3e9501a","Type":"ContainerDied","Data":"015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb"} Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.750300 4892 generic.go:334] "Generic (PLEG): container finished" podID="3d527912-c53f-420a-a4d2-a417f9b9caa1" containerID="f795c3e12042852d1382a8796ff6f9d85f87241b264630d9aae2e5b5f63098a8" exitCode=0 Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.750454 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" event={"ID":"3d527912-c53f-420a-a4d2-a417f9b9caa1","Type":"ContainerDied","Data":"f795c3e12042852d1382a8796ff6f9d85f87241b264630d9aae2e5b5f63098a8"} Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.775489 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.877006 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 17 18:01:51 crc kubenswrapper[4892]: I0217 18:01:51.877679 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.490712 4892 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 17 18:01:52 crc kubenswrapper[4892]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/3d527912-c53f-420a-a4d2-a417f9b9caa1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 18:01:52 crc kubenswrapper[4892]: > podSandboxID="f76993c4fba0708fa54b43c81316637892b2a398f0393bf734066ea064f5db27" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.491549 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:01:52 crc kubenswrapper[4892]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch647h5fdh676h5c8h566h96h5d8hdh569h64dh5b5h587h55h5cch58dh658h67h5f6h64fh648h6h59fh65ch7hf9hf6h74hf8hch596h5b8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5j8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6bc7876d45-s6h6h_openstack(3d527912-c53f-420a-a4d2-a417f9b9caa1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/3d527912-c53f-420a-a4d2-a417f9b9caa1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 18:01:52 crc kubenswrapper[4892]: > logger="UnhandledError" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.492721 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/3d527912-c53f-420a-a4d2-a417f9b9caa1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" podUID="3d527912-c53f-420a-a4d2-a417f9b9caa1" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.509035 4892 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 17 18:01:52 crc kubenswrapper[4892]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/059760ef-70ba-461a-8ff9-e873c3e9501a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 18:01:52 crc kubenswrapper[4892]: > podSandboxID="65178bec6aa381d8e962e4b1c436b2bc672b61f766ca52a02231df65d6a49da7" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.509246 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:01:52 crc kubenswrapper[4892]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,Rec
ursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5wfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-q88lk_openstack(059760ef-70ba-461a-8ff9-e873c3e9501a): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/059760ef-70ba-461a-8ff9-e873c3e9501a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 18:01:52 crc kubenswrapper[4892]: > logger="UnhandledError" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.510440 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/059760ef-70ba-461a-8ff9-e873c3e9501a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-q88lk" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.536917 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.546591 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.549234 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.549393 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.550392 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.550562 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-z5c7w" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.562760 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.615728 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.615805 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.615870 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdkcv\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-kube-api-access-fdkcv\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.615894 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-lock\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.615934 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-cache\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.615974 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a693208-de83-4f05-b4ff-0b3e7f858c74-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.686064 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hkkrz"] Feb 17 18:01:52 crc kubenswrapper[4892]: W0217 18:01:52.695200 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda40576fc_fbd3_45f5_afb8_50de90642017.slice/crio-cb332a6d06af9a760e318c299c1023164b04aa143f34562f8b24e09089f6f829 WatchSource:0}: Error finding container cb332a6d06af9a760e318c299c1023164b04aa143f34562f8b24e09089f6f829: Status 404 returned error can't find the container with id cb332a6d06af9a760e318c299c1023164b04aa143f34562f8b24e09089f6f829 Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.717681 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdkcv\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-kube-api-access-fdkcv\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.717715 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-lock\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.717761 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-cache\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.717860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a693208-de83-4f05-b4ff-0b3e7f858c74-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.717891 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.717952 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.718269 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.718287 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-lock\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.718335 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-cache\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.718685 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.718704 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 
18:01:52 crc kubenswrapper[4892]: E0217 18:01:52.718740 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift podName:0a693208-de83-4f05-b4ff-0b3e7f858c74 nodeName:}" failed. No retries permitted until 2026-02-17 18:01:53.218726803 +0000 UTC m=+1084.594130068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift") pod "swift-storage-0" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74") : configmap "swift-ring-files" not found
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.722744 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a693208-de83-4f05-b4ff-0b3e7f858c74-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0"
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.733792 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdkcv\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-kube-api-access-fdkcv\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0"
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.748295 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0"
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.769539 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b688e91-e3c2-4a0f-a784-a694f951ea5e","Type":"ContainerStarted","Data":"f5ee250509d5fcfefa3da2589c17b1ebe3b3316577d89b3435a80ae89dc5bf32"}
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.769724 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b688e91-e3c2-4a0f-a784-a694f951ea5e","Type":"ContainerStarted","Data":"ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a"}
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.770796 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.774303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" event={"ID":"a40576fc-fbd3-45f5-afb8-50de90642017","Type":"ContainerStarted","Data":"cb332a6d06af9a760e318c299c1023164b04aa143f34562f8b24e09089f6f829"}
Feb 17 18:01:52 crc kubenswrapper[4892]: I0217 18:01:52.797883 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.0858923 podStartE2EDuration="3.797863561s" podCreationTimestamp="2026-02-17 18:01:49 +0000 UTC" firstStartedPulling="2026-02-17 18:01:50.54220166 +0000 UTC m=+1081.917604935" lastFinishedPulling="2026-02-17 18:01:52.254172911 +0000 UTC m=+1083.629576196" observedRunningTime="2026-02-17 18:01:52.791057696 +0000 UTC m=+1084.166460971" watchObservedRunningTime="2026-02-17 18:01:52.797863561 +0000 UTC m=+1084.173266826"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.029672 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xgmsg"]
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.030922 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.034641 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.034883 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.035028 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.059024 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xgmsg"]
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.079923 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.125204 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-ovsdbserver-sb\") pod \"3d527912-c53f-420a-a4d2-a417f9b9caa1\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") "
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.125291 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-config\") pod \"3d527912-c53f-420a-a4d2-a417f9b9caa1\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") "
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.125382 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-dns-svc\") pod \"3d527912-c53f-420a-a4d2-a417f9b9caa1\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") "
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.125443 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5j8x\" (UniqueName: \"kubernetes.io/projected/3d527912-c53f-420a-a4d2-a417f9b9caa1-kube-api-access-c5j8x\") pod \"3d527912-c53f-420a-a4d2-a417f9b9caa1\" (UID: \"3d527912-c53f-420a-a4d2-a417f9b9caa1\") "
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.125869 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-ring-data-devices\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.125932 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7ljm\" (UniqueName: \"kubernetes.io/projected/f78e90e9-8804-4e06-8083-785c64a86a1d-kube-api-access-j7ljm\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.126056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-combined-ca-bundle\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.126120 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-scripts\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.126210 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f78e90e9-8804-4e06-8083-785c64a86a1d-etc-swift\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.126249 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-swiftconf\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.126284 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-dispersionconf\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.129401 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d527912-c53f-420a-a4d2-a417f9b9caa1-kube-api-access-c5j8x" (OuterVolumeSpecName: "kube-api-access-c5j8x") pod "3d527912-c53f-420a-a4d2-a417f9b9caa1" (UID: "3d527912-c53f-420a-a4d2-a417f9b9caa1"). InnerVolumeSpecName "kube-api-access-c5j8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.172128 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-config" (OuterVolumeSpecName: "config") pod "3d527912-c53f-420a-a4d2-a417f9b9caa1" (UID: "3d527912-c53f-420a-a4d2-a417f9b9caa1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.180678 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d527912-c53f-420a-a4d2-a417f9b9caa1" (UID: "3d527912-c53f-420a-a4d2-a417f9b9caa1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.194820 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d527912-c53f-420a-a4d2-a417f9b9caa1" (UID: "3d527912-c53f-420a-a4d2-a417f9b9caa1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228267 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-ring-data-devices\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228357 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7ljm\" (UniqueName: \"kubernetes.io/projected/f78e90e9-8804-4e06-8083-785c64a86a1d-kube-api-access-j7ljm\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228435 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-combined-ca-bundle\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228484 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-scripts\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228551 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f78e90e9-8804-4e06-8083-785c64a86a1d-etc-swift\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228594 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-swiftconf\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228623 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-dispersionconf\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228728 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228743 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5j8x\" (UniqueName: \"kubernetes.io/projected/3d527912-c53f-420a-a4d2-a417f9b9caa1-kube-api-access-c5j8x\") on node \"crc\" DevicePath \"\""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228755 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.228768 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d527912-c53f-420a-a4d2-a417f9b9caa1-config\") on node \"crc\" DevicePath \"\""
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.230016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-scripts\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.230514 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-ring-data-devices\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: E0217 18:01:53.231302 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 18:01:53 crc kubenswrapper[4892]: E0217 18:01:53.231332 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 18:01:53 crc kubenswrapper[4892]: E0217 18:01:53.231385 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift podName:0a693208-de83-4f05-b4ff-0b3e7f858c74 nodeName:}" failed. No retries permitted until 2026-02-17 18:01:54.231366564 +0000 UTC m=+1085.606769829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift") pod "swift-storage-0" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74") : configmap "swift-ring-files" not found
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.231756 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-dispersionconf\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.232085 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f78e90e9-8804-4e06-8083-785c64a86a1d-etc-swift\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.233954 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-combined-ca-bundle\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.236177 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-swiftconf\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.244270 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7ljm\" (UniqueName: \"kubernetes.io/projected/f78e90e9-8804-4e06-8083-785c64a86a1d-kube-api-access-j7ljm\") pod \"swift-ring-rebalance-xgmsg\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.352673 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xgmsg"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.788477 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.788470 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-s6h6h" event={"ID":"3d527912-c53f-420a-a4d2-a417f9b9caa1","Type":"ContainerDied","Data":"f76993c4fba0708fa54b43c81316637892b2a398f0393bf734066ea064f5db27"}
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.795325 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q88lk" event={"ID":"059760ef-70ba-461a-8ff9-e873c3e9501a","Type":"ContainerStarted","Data":"705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318"}
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.795474 4892 scope.go:117] "RemoveContainer" containerID="f795c3e12042852d1382a8796ff6f9d85f87241b264630d9aae2e5b5f63098a8"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.796105 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-q88lk"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.806167 4892 generic.go:334] "Generic (PLEG): container finished" podID="a40576fc-fbd3-45f5-afb8-50de90642017" containerID="1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22" exitCode=0
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.806616 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" event={"ID":"a40576fc-fbd3-45f5-afb8-50de90642017","Type":"ContainerDied","Data":"1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22"}
Feb 17 18:01:53 crc kubenswrapper[4892]: W0217 18:01:53.849135 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf78e90e9_8804_4e06_8083_785c64a86a1d.slice/crio-9b10c1628d240ccc2d5772a96deb0f12c56b1ef94f016d0153bca18114e366b3 WatchSource:0}: Error finding container 9b10c1628d240ccc2d5772a96deb0f12c56b1ef94f016d0153bca18114e366b3: Status 404 returned error can't find the container with id 9b10c1628d240ccc2d5772a96deb0f12c56b1ef94f016d0153bca18114e366b3
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.856004 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xgmsg"]
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.873800 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-q88lk" podStartSLOduration=4.350610365 podStartE2EDuration="4.873778739s" podCreationTimestamp="2026-02-17 18:01:49 +0000 UTC" firstStartedPulling="2026-02-17 18:01:50.304455052 +0000 UTC m=+1081.679858317" lastFinishedPulling="2026-02-17 18:01:50.827623426 +0000 UTC m=+1082.203026691" observedRunningTime="2026-02-17 18:01:53.816758139 +0000 UTC m=+1085.192161424" watchObservedRunningTime="2026-02-17 18:01:53.873778739 +0000 UTC m=+1085.249182004"
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.918728 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-s6h6h"]
Feb 17 18:01:53 crc kubenswrapper[4892]: I0217 18:01:53.928137 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-s6h6h"]
Feb 17 18:01:54 crc kubenswrapper[4892]: I0217 18:01:54.246893 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0"
Feb 17 18:01:54 crc kubenswrapper[4892]: E0217 18:01:54.247169 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 18:01:54 crc kubenswrapper[4892]: E0217 18:01:54.247214 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 18:01:54 crc kubenswrapper[4892]: E0217 18:01:54.247283 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift podName:0a693208-de83-4f05-b4ff-0b3e7f858c74 nodeName:}" failed. No retries permitted until 2026-02-17 18:01:56.247258842 +0000 UTC m=+1087.622662127 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift") pod "swift-storage-0" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74") : configmap "swift-ring-files" not found
Feb 17 18:01:54 crc kubenswrapper[4892]: I0217 18:01:54.816626 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xgmsg" event={"ID":"f78e90e9-8804-4e06-8083-785c64a86a1d","Type":"ContainerStarted","Data":"9b10c1628d240ccc2d5772a96deb0f12c56b1ef94f016d0153bca18114e366b3"}
Feb 17 18:01:54 crc kubenswrapper[4892]: I0217 18:01:54.818900 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" event={"ID":"a40576fc-fbd3-45f5-afb8-50de90642017","Type":"ContainerStarted","Data":"ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e"}
Feb 17 18:01:54 crc kubenswrapper[4892]: I0217 18:01:54.843801 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" podStartSLOduration=3.843780158 podStartE2EDuration="3.843780158s" podCreationTimestamp="2026-02-17 18:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:01:54.836608523 +0000 UTC m=+1086.212011788" watchObservedRunningTime="2026-02-17 18:01:54.843780158 +0000 UTC m=+1086.219183433"
Feb 17 18:01:55 crc kubenswrapper[4892]: I0217 18:01:55.370151 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d527912-c53f-420a-a4d2-a417f9b9caa1" path="/var/lib/kubelet/pods/3d527912-c53f-420a-a4d2-a417f9b9caa1/volumes"
Feb 17 18:01:55 crc kubenswrapper[4892]: I0217 18:01:55.830487 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.288302 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0"
Feb 17 18:01:56 crc kubenswrapper[4892]: E0217 18:01:56.288492 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 18:01:56 crc kubenswrapper[4892]: E0217 18:01:56.288504 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 18:01:56 crc kubenswrapper[4892]: E0217 18:01:56.288540 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift podName:0a693208-de83-4f05-b4ff-0b3e7f858c74 nodeName:}" failed. No retries permitted until 2026-02-17 18:02:00.288527794 +0000 UTC m=+1091.663931059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift") pod "swift-storage-0" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74") : configmap "swift-ring-files" not found
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.333255 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-q4wgd"]
Feb 17 18:01:56 crc kubenswrapper[4892]: E0217 18:01:56.334397 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d527912-c53f-420a-a4d2-a417f9b9caa1" containerName="init"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.334417 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d527912-c53f-420a-a4d2-a417f9b9caa1" containerName="init"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.335377 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d527912-c53f-420a-a4d2-a417f9b9caa1" containerName="init"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.337894 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.340679 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.381462 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q4wgd"]
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.389757 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b940cf2-ddde-420c-bfef-e2c2876b8919-operator-scripts\") pod \"root-account-create-update-q4wgd\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") " pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.390146 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2vg\" (UniqueName: \"kubernetes.io/projected/5b940cf2-ddde-420c-bfef-e2c2876b8919-kube-api-access-7b2vg\") pod \"root-account-create-update-q4wgd\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") " pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.492582 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b940cf2-ddde-420c-bfef-e2c2876b8919-operator-scripts\") pod \"root-account-create-update-q4wgd\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") " pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.492692 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2vg\" (UniqueName: \"kubernetes.io/projected/5b940cf2-ddde-420c-bfef-e2c2876b8919-kube-api-access-7b2vg\") pod \"root-account-create-update-q4wgd\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") " pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.493320 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b940cf2-ddde-420c-bfef-e2c2876b8919-operator-scripts\") pod \"root-account-create-update-q4wgd\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") " pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.512317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2vg\" (UniqueName: \"kubernetes.io/projected/5b940cf2-ddde-420c-bfef-e2c2876b8919-kube-api-access-7b2vg\") pod \"root-account-create-update-q4wgd\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") " pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:56 crc kubenswrapper[4892]: I0217 18:01:56.677278 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:57 crc kubenswrapper[4892]: I0217 18:01:57.389552 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q4wgd"]
Feb 17 18:01:57 crc kubenswrapper[4892]: I0217 18:01:57.855185 4892 generic.go:334] "Generic (PLEG): container finished" podID="5b940cf2-ddde-420c-bfef-e2c2876b8919" containerID="be35a06a29cd7c2a1ba928f15928d076ce2f85d560d8048d0cb22851b293f9a7" exitCode=0
Feb 17 18:01:57 crc kubenswrapper[4892]: I0217 18:01:57.855464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q4wgd" event={"ID":"5b940cf2-ddde-420c-bfef-e2c2876b8919","Type":"ContainerDied","Data":"be35a06a29cd7c2a1ba928f15928d076ce2f85d560d8048d0cb22851b293f9a7"}
Feb 17 18:01:57 crc kubenswrapper[4892]: I0217 18:01:57.855618 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q4wgd" event={"ID":"5b940cf2-ddde-420c-bfef-e2c2876b8919","Type":"ContainerStarted","Data":"0196c8bc26ef4f69451a4a84a5cc87915d01ca3adbabef02dd4e158a7cb8beb6"}
Feb 17 18:01:57 crc kubenswrapper[4892]: I0217 18:01:57.858424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xgmsg" event={"ID":"f78e90e9-8804-4e06-8083-785c64a86a1d","Type":"ContainerStarted","Data":"20e78a2656091e6830683ed12bdcfe9347b0af0753251ba18342603e69dcb165"}
Feb 17 18:01:57 crc kubenswrapper[4892]: I0217 18:01:57.898158 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xgmsg" podStartSLOduration=1.826519862 podStartE2EDuration="4.898140291s" podCreationTimestamp="2026-02-17 18:01:53 +0000 UTC" firstStartedPulling="2026-02-17 18:01:53.880417748 +0000 UTC m=+1085.255821023" lastFinishedPulling="2026-02-17 18:01:56.952038167 +0000 UTC m=+1088.327441452" observedRunningTime="2026-02-17 18:01:57.891774719 +0000 UTC m=+1089.267178024" watchObservedRunningTime="2026-02-17 18:01:57.898140291 +0000 UTC m=+1089.273543566"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.264649 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q4wgd"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.373570 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b940cf2-ddde-420c-bfef-e2c2876b8919-operator-scripts\") pod \"5b940cf2-ddde-420c-bfef-e2c2876b8919\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") "
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.373642 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b2vg\" (UniqueName: \"kubernetes.io/projected/5b940cf2-ddde-420c-bfef-e2c2876b8919-kube-api-access-7b2vg\") pod \"5b940cf2-ddde-420c-bfef-e2c2876b8919\" (UID: \"5b940cf2-ddde-420c-bfef-e2c2876b8919\") "
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.374297 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b940cf2-ddde-420c-bfef-e2c2876b8919-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b940cf2-ddde-420c-bfef-e2c2876b8919" (UID: "5b940cf2-ddde-420c-bfef-e2c2876b8919"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.378259 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b940cf2-ddde-420c-bfef-e2c2876b8919-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.393024 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b940cf2-ddde-420c-bfef-e2c2876b8919-kube-api-access-7b2vg" (OuterVolumeSpecName: "kube-api-access-7b2vg") pod "5b940cf2-ddde-420c-bfef-e2c2876b8919" (UID: "5b940cf2-ddde-420c-bfef-e2c2876b8919"). InnerVolumeSpecName "kube-api-access-7b2vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.479934 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b2vg\" (UniqueName: \"kubernetes.io/projected/5b940cf2-ddde-420c-bfef-e2c2876b8919-kube-api-access-7b2vg\") on node \"crc\" DevicePath \"\""
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.500790 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-tvldr"]
Feb 17 18:01:59 crc kubenswrapper[4892]: E0217 18:01:59.501212 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b940cf2-ddde-420c-bfef-e2c2876b8919" containerName="mariadb-account-create-update"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.501232 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b940cf2-ddde-420c-bfef-e2c2876b8919" containerName="mariadb-account-create-update"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.501630 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b940cf2-ddde-420c-bfef-e2c2876b8919" containerName="mariadb-account-create-update"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.502213 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tvldr"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.512389 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tvldr"]
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.581906 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-operator-scripts\") pod \"glance-db-create-tvldr\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " pod="openstack/glance-db-create-tvldr"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.582052 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rsc\" (UniqueName: \"kubernetes.io/projected/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-kube-api-access-78rsc\") pod \"glance-db-create-tvldr\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " pod="openstack/glance-db-create-tvldr"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.601938 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5c69-account-create-update-dl45f"]
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.604914 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5c69-account-create-update-dl45f"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.610180 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.614486 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5c69-account-create-update-dl45f"]
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.678263 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-q88lk"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.683657 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-operator-scripts\") pod \"glance-db-create-tvldr\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " pod="openstack/glance-db-create-tvldr"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.683734 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrsz\" (UniqueName: \"kubernetes.io/projected/46aabc4c-95df-425a-bde5-db7523f34d7f-kube-api-access-jtrsz\") pod \"glance-5c69-account-create-update-dl45f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " pod="openstack/glance-5c69-account-create-update-dl45f"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.683856 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78rsc\" (UniqueName: \"kubernetes.io/projected/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-kube-api-access-78rsc\") pod \"glance-db-create-tvldr\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " pod="openstack/glance-db-create-tvldr"
Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.684011 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/46aabc4c-95df-425a-bde5-db7523f34d7f-operator-scripts\") pod \"glance-5c69-account-create-update-dl45f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.684502 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-operator-scripts\") pod \"glance-db-create-tvldr\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " pod="openstack/glance-db-create-tvldr" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.704652 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rsc\" (UniqueName: \"kubernetes.io/projected/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-kube-api-access-78rsc\") pod \"glance-db-create-tvldr\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " pod="openstack/glance-db-create-tvldr" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.785570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrsz\" (UniqueName: \"kubernetes.io/projected/46aabc4c-95df-425a-bde5-db7523f34d7f-kube-api-access-jtrsz\") pod \"glance-5c69-account-create-update-dl45f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.785770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aabc4c-95df-425a-bde5-db7523f34d7f-operator-scripts\") pod \"glance-5c69-account-create-update-dl45f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.786502 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aabc4c-95df-425a-bde5-db7523f34d7f-operator-scripts\") pod \"glance-5c69-account-create-update-dl45f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.805325 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrsz\" (UniqueName: \"kubernetes.io/projected/46aabc4c-95df-425a-bde5-db7523f34d7f-kube-api-access-jtrsz\") pod \"glance-5c69-account-create-update-dl45f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.819279 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tvldr" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.879540 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q4wgd" event={"ID":"5b940cf2-ddde-420c-bfef-e2c2876b8919","Type":"ContainerDied","Data":"0196c8bc26ef4f69451a4a84a5cc87915d01ca3adbabef02dd4e158a7cb8beb6"} Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.879602 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0196c8bc26ef4f69451a4a84a5cc87915d01ca3adbabef02dd4e158a7cb8beb6" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.879692 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q4wgd" Feb 17 18:01:59 crc kubenswrapper[4892]: I0217 18:01:59.920418 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.267052 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tvldr"] Feb 17 18:02:00 crc kubenswrapper[4892]: W0217 18:02:00.270731 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49dc1deb_f5b7_4b82_bd8e_d1c6dbe9f513.slice/crio-4d55d2347215e118f1acc7cd1cf7ad0799061e739916e8ffd39b98eaea541cb8 WatchSource:0}: Error finding container 4d55d2347215e118f1acc7cd1cf7ad0799061e739916e8ffd39b98eaea541cb8: Status 404 returned error can't find the container with id 4d55d2347215e118f1acc7cd1cf7ad0799061e739916e8ffd39b98eaea541cb8 Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.295251 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:02:00 crc kubenswrapper[4892]: E0217 18:02:00.295484 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 18:02:00 crc kubenswrapper[4892]: E0217 18:02:00.295519 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 18:02:00 crc kubenswrapper[4892]: E0217 18:02:00.295581 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift podName:0a693208-de83-4f05-b4ff-0b3e7f858c74 nodeName:}" failed. No retries permitted until 2026-02-17 18:02:08.295563939 +0000 UTC m=+1099.670967204 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift") pod "swift-storage-0" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74") : configmap "swift-ring-files" not found Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.343127 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zf6jg"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.344296 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.353549 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zf6jg"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.399913 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjdwq\" (UniqueName: \"kubernetes.io/projected/17e67c77-46ba-4230-b84d-bc8e6952b2d8-kube-api-access-xjdwq\") pod \"keystone-db-create-zf6jg\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.399949 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e67c77-46ba-4230-b84d-bc8e6952b2d8-operator-scripts\") pod \"keystone-db-create-zf6jg\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.414828 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5c69-account-create-update-dl45f"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.457807 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-36cd-account-create-update-9tkph"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.459387 4892 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.461703 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.470088 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-36cd-account-create-update-9tkph"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.502440 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq6xq\" (UniqueName: \"kubernetes.io/projected/0126c9e0-502f-4109-a6cf-25eccd572dff-kube-api-access-rq6xq\") pod \"keystone-36cd-account-create-update-9tkph\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.502560 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0126c9e0-502f-4109-a6cf-25eccd572dff-operator-scripts\") pod \"keystone-36cd-account-create-update-9tkph\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.502683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjdwq\" (UniqueName: \"kubernetes.io/projected/17e67c77-46ba-4230-b84d-bc8e6952b2d8-kube-api-access-xjdwq\") pod \"keystone-db-create-zf6jg\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.502721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e67c77-46ba-4230-b84d-bc8e6952b2d8-operator-scripts\") pod 
\"keystone-db-create-zf6jg\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.503643 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e67c77-46ba-4230-b84d-bc8e6952b2d8-operator-scripts\") pod \"keystone-db-create-zf6jg\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.526074 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjdwq\" (UniqueName: \"kubernetes.io/projected/17e67c77-46ba-4230-b84d-bc8e6952b2d8-kube-api-access-xjdwq\") pod \"keystone-db-create-zf6jg\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.552678 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7xltk"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.554148 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.562521 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7xltk"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.605268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72be734-0fa6-4a40-8f95-984db82d859c-operator-scripts\") pod \"placement-db-create-7xltk\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.605331 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq6xq\" (UniqueName: \"kubernetes.io/projected/0126c9e0-502f-4109-a6cf-25eccd572dff-kube-api-access-rq6xq\") pod \"keystone-36cd-account-create-update-9tkph\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.605367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n82gk\" (UniqueName: \"kubernetes.io/projected/c72be734-0fa6-4a40-8f95-984db82d859c-kube-api-access-n82gk\") pod \"placement-db-create-7xltk\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.605403 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0126c9e0-502f-4109-a6cf-25eccd572dff-operator-scripts\") pod \"keystone-36cd-account-create-update-9tkph\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.606471 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0126c9e0-502f-4109-a6cf-25eccd572dff-operator-scripts\") pod \"keystone-36cd-account-create-update-9tkph\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.626286 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq6xq\" (UniqueName: \"kubernetes.io/projected/0126c9e0-502f-4109-a6cf-25eccd572dff-kube-api-access-rq6xq\") pod \"keystone-36cd-account-create-update-9tkph\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.645867 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2050-account-create-update-k5ss6"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.647073 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.663200 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.674754 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.678557 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2050-account-create-update-k5ss6"] Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.709304 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72be734-0fa6-4a40-8f95-984db82d859c-operator-scripts\") pod \"placement-db-create-7xltk\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.709425 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n82gk\" (UniqueName: \"kubernetes.io/projected/c72be734-0fa6-4a40-8f95-984db82d859c-kube-api-access-n82gk\") pod \"placement-db-create-7xltk\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.711124 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72be734-0fa6-4a40-8f95-984db82d859c-operator-scripts\") pod \"placement-db-create-7xltk\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.730441 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n82gk\" (UniqueName: \"kubernetes.io/projected/c72be734-0fa6-4a40-8f95-984db82d859c-kube-api-access-n82gk\") pod \"placement-db-create-7xltk\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.822451 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fb8ce869-af70-43d7-b341-cf6c53900d97-operator-scripts\") pod \"placement-2050-account-create-update-k5ss6\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.822911 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8svl\" (UniqueName: \"kubernetes.io/projected/fb8ce869-af70-43d7-b341-cf6c53900d97-kube-api-access-v8svl\") pod \"placement-2050-account-create-update-k5ss6\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.910781 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tvldr" event={"ID":"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513","Type":"ContainerStarted","Data":"4d55d2347215e118f1acc7cd1cf7ad0799061e739916e8ffd39b98eaea541cb8"} Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.912976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5c69-account-create-update-dl45f" event={"ID":"46aabc4c-95df-425a-bde5-db7523f34d7f","Type":"ContainerStarted","Data":"40bd37f18d08276d6fe35bf92ebf86226d5675bb83c184a466363fdaf1031007"} Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.919245 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.923904 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb8ce869-af70-43d7-b341-cf6c53900d97-operator-scripts\") pod \"placement-2050-account-create-update-k5ss6\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.924591 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8svl\" (UniqueName: \"kubernetes.io/projected/fb8ce869-af70-43d7-b341-cf6c53900d97-kube-api-access-v8svl\") pod \"placement-2050-account-create-update-k5ss6\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.926218 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb8ce869-af70-43d7-b341-cf6c53900d97-operator-scripts\") pod \"placement-2050-account-create-update-k5ss6\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.933299 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xltk" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.949126 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8svl\" (UniqueName: \"kubernetes.io/projected/fb8ce869-af70-43d7-b341-cf6c53900d97-kube-api-access-v8svl\") pod \"placement-2050-account-create-update-k5ss6\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:00 crc kubenswrapper[4892]: I0217 18:02:00.951674 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.146003 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zf6jg"] Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.584676 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7xltk"] Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.597409 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-36cd-account-create-update-9tkph"] Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.660060 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2050-account-create-update-k5ss6"] Feb 17 18:02:01 crc kubenswrapper[4892]: W0217 18:02:01.667972 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb8ce869_af70_43d7_b341_cf6c53900d97.slice/crio-88e4ccdc5286e61812f3364e7e9f7edfb9e18100d3375e34665db970b3866e10 WatchSource:0}: Error finding container 88e4ccdc5286e61812f3364e7e9f7edfb9e18100d3375e34665db970b3866e10: Status 404 returned error can't find the container with id 88e4ccdc5286e61812f3364e7e9f7edfb9e18100d3375e34665db970b3866e10 Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.880016 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.932741 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xltk" event={"ID":"c72be734-0fa6-4a40-8f95-984db82d859c","Type":"ContainerStarted","Data":"5d7067482fa1f32b3456410b256ff6fbbbdc9dd6a1a70f2b44e6ef0f424e2683"} Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.936516 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zf6jg" event={"ID":"17e67c77-46ba-4230-b84d-bc8e6952b2d8","Type":"ContainerStarted","Data":"0ed10cd1809fc1ebc8ac2ad88bd47659d051e13889fd973bc2f836802f14044e"} Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.936579 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zf6jg" event={"ID":"17e67c77-46ba-4230-b84d-bc8e6952b2d8","Type":"ContainerStarted","Data":"1d6aa8889cf3c4274fc999b4dbb0fae917a512a6a95be47c846e3f368ecb1de2"} Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.940120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2050-account-create-update-k5ss6" event={"ID":"fb8ce869-af70-43d7-b341-cf6c53900d97","Type":"ContainerStarted","Data":"88e4ccdc5286e61812f3364e7e9f7edfb9e18100d3375e34665db970b3866e10"} Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.941871 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q88lk"] Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.944828 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-q88lk" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerName="dnsmasq-dns" containerID="cri-o://705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318" gracePeriod=10 Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.967441 4892 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/keystone-db-create-zf6jg" podStartSLOduration=1.967411807 podStartE2EDuration="1.967411807s" podCreationTimestamp="2026-02-17 18:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:01.955801883 +0000 UTC m=+1093.331205148" watchObservedRunningTime="2026-02-17 18:02:01.967411807 +0000 UTC m=+1093.342815082" Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.984405 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36cd-account-create-update-9tkph" event={"ID":"0126c9e0-502f-4109-a6cf-25eccd572dff","Type":"ContainerStarted","Data":"bee4189c7539f35f8f9e2a0eb3a697b47e5a5d4eef89ead1df0f8682c8e94d4c"} Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.999665 4892 generic.go:334] "Generic (PLEG): container finished" podID="49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" containerID="d6eca5b3fff973b26b1e0665698334af0d197984bc7aca23a2b70b59cc1daf4d" exitCode=0 Feb 17 18:02:01 crc kubenswrapper[4892]: I0217 18:02:01.999741 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tvldr" event={"ID":"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513","Type":"ContainerDied","Data":"d6eca5b3fff973b26b1e0665698334af0d197984bc7aca23a2b70b59cc1daf4d"} Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.009994 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5c69-account-create-update-dl45f" event={"ID":"46aabc4c-95df-425a-bde5-db7523f34d7f","Type":"ContainerStarted","Data":"e26651b0dac4631f1d31c476692d4a84f0c1e5516e237a2c33f25e57e3776b8c"} Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.056093 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-5c69-account-create-update-dl45f" podStartSLOduration=3.056068929 podStartE2EDuration="3.056068929s" podCreationTimestamp="2026-02-17 18:01:59 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:02.040628993 +0000 UTC m=+1093.416032268" watchObservedRunningTime="2026-02-17 18:02:02.056068929 +0000 UTC m=+1093.431472194" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.589976 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.689087 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-q4wgd"] Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.696275 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-q4wgd"] Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.761312 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-config\") pod \"059760ef-70ba-461a-8ff9-e873c3e9501a\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.761462 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-sb\") pod \"059760ef-70ba-461a-8ff9-e873c3e9501a\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.761521 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-dns-svc\") pod \"059760ef-70ba-461a-8ff9-e873c3e9501a\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.761635 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-nb\") pod \"059760ef-70ba-461a-8ff9-e873c3e9501a\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.761690 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5wfn\" (UniqueName: \"kubernetes.io/projected/059760ef-70ba-461a-8ff9-e873c3e9501a-kube-api-access-f5wfn\") pod \"059760ef-70ba-461a-8ff9-e873c3e9501a\" (UID: \"059760ef-70ba-461a-8ff9-e873c3e9501a\") " Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.768933 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059760ef-70ba-461a-8ff9-e873c3e9501a-kube-api-access-f5wfn" (OuterVolumeSpecName: "kube-api-access-f5wfn") pod "059760ef-70ba-461a-8ff9-e873c3e9501a" (UID: "059760ef-70ba-461a-8ff9-e873c3e9501a"). InnerVolumeSpecName "kube-api-access-f5wfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.811056 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "059760ef-70ba-461a-8ff9-e873c3e9501a" (UID: "059760ef-70ba-461a-8ff9-e873c3e9501a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.834860 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-config" (OuterVolumeSpecName: "config") pod "059760ef-70ba-461a-8ff9-e873c3e9501a" (UID: "059760ef-70ba-461a-8ff9-e873c3e9501a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.837177 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "059760ef-70ba-461a-8ff9-e873c3e9501a" (UID: "059760ef-70ba-461a-8ff9-e873c3e9501a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.840034 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "059760ef-70ba-461a-8ff9-e873c3e9501a" (UID: "059760ef-70ba-461a-8ff9-e873c3e9501a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.863697 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.863734 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.863747 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5wfn\" (UniqueName: \"kubernetes.io/projected/059760ef-70ba-461a-8ff9-e873c3e9501a-kube-api-access-f5wfn\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:02 crc kubenswrapper[4892]: I0217 18:02:02.863759 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:02 crc 
kubenswrapper[4892]: I0217 18:02:02.863770 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/059760ef-70ba-461a-8ff9-e873c3e9501a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.022057 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xltk" event={"ID":"c72be734-0fa6-4a40-8f95-984db82d859c","Type":"ContainerStarted","Data":"6b32704fbc72ff9a7cc5ded27500a276ebad694cfbdf97871511f5b03b78561d"} Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.026741 4892 generic.go:334] "Generic (PLEG): container finished" podID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerID="705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318" exitCode=0 Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.026855 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q88lk" event={"ID":"059760ef-70ba-461a-8ff9-e873c3e9501a","Type":"ContainerDied","Data":"705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318"} Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.026927 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q88lk" event={"ID":"059760ef-70ba-461a-8ff9-e873c3e9501a","Type":"ContainerDied","Data":"65178bec6aa381d8e962e4b1c436b2bc672b61f766ca52a02231df65d6a49da7"} Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.026998 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q88lk" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.026989 4892 scope.go:117] "RemoveContainer" containerID="705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.028943 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2050-account-create-update-k5ss6" event={"ID":"fb8ce869-af70-43d7-b341-cf6c53900d97","Type":"ContainerStarted","Data":"ceb4c385d00646492bf20fac2ee2ee7bfac8d274b0a4abeaaee8d01b2d0db45c"} Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.030944 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36cd-account-create-update-9tkph" event={"ID":"0126c9e0-502f-4109-a6cf-25eccd572dff","Type":"ContainerStarted","Data":"cee32f5ada1e424bb8f30122322fd702190cef4acfca76eec13044ef381a90ff"} Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.046992 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-7xltk" podStartSLOduration=3.046936532 podStartE2EDuration="3.046936532s" podCreationTimestamp="2026-02-17 18:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:03.038118854 +0000 UTC m=+1094.413522129" watchObservedRunningTime="2026-02-17 18:02:03.046936532 +0000 UTC m=+1094.422339817" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.065140 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-36cd-account-create-update-9tkph" podStartSLOduration=3.065112723 podStartE2EDuration="3.065112723s" podCreationTimestamp="2026-02-17 18:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:03.060505038 +0000 UTC m=+1094.435908303" 
watchObservedRunningTime="2026-02-17 18:02:03.065112723 +0000 UTC m=+1094.440516018" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.095152 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-2050-account-create-update-k5ss6" podStartSLOduration=3.095126473 podStartE2EDuration="3.095126473s" podCreationTimestamp="2026-02-17 18:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:03.08649998 +0000 UTC m=+1094.461903285" watchObservedRunningTime="2026-02-17 18:02:03.095126473 +0000 UTC m=+1094.470529758" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.123386 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q88lk"] Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.150197 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q88lk"] Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.381651 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" path="/var/lib/kubelet/pods/059760ef-70ba-461a-8ff9-e873c3e9501a/volumes" Feb 17 18:02:03 crc kubenswrapper[4892]: I0217 18:02:03.383442 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b940cf2-ddde-420c-bfef-e2c2876b8919" path="/var/lib/kubelet/pods/5b940cf2-ddde-420c-bfef-e2c2876b8919/volumes" Feb 17 18:02:04 crc kubenswrapper[4892]: I0217 18:02:04.048450 4892 generic.go:334] "Generic (PLEG): container finished" podID="17e67c77-46ba-4230-b84d-bc8e6952b2d8" containerID="0ed10cd1809fc1ebc8ac2ad88bd47659d051e13889fd973bc2f836802f14044e" exitCode=0 Feb 17 18:02:04 crc kubenswrapper[4892]: I0217 18:02:04.049416 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zf6jg" 
event={"ID":"17e67c77-46ba-4230-b84d-bc8e6952b2d8","Type":"ContainerDied","Data":"0ed10cd1809fc1ebc8ac2ad88bd47659d051e13889fd973bc2f836802f14044e"} Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.102151 4892 scope.go:117] "RemoveContainer" containerID="015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.216704 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tvldr" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.220871 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78rsc\" (UniqueName: \"kubernetes.io/projected/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-kube-api-access-78rsc\") pod \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.221227 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-operator-scripts\") pod \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\" (UID: \"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513\") " Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.221981 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" (UID: "49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.229973 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.235605 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-kube-api-access-78rsc" (OuterVolumeSpecName: "kube-api-access-78rsc") pod "49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" (UID: "49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513"). InnerVolumeSpecName "kube-api-access-78rsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.260140 4892 scope.go:117] "RemoveContainer" containerID="705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318" Feb 17 18:02:05 crc kubenswrapper[4892]: E0217 18:02:05.260967 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318\": container with ID starting with 705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318 not found: ID does not exist" containerID="705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.261009 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318"} err="failed to get container status \"705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318\": rpc error: code = NotFound desc = could not find container \"705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318\": container with ID starting with 705aa0dfc26d371fdcc08c162ebc21671ebbcf58a789b8333b5850c9cd47a318 
not found: ID does not exist" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.261037 4892 scope.go:117] "RemoveContainer" containerID="015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb" Feb 17 18:02:05 crc kubenswrapper[4892]: E0217 18:02:05.261385 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb\": container with ID starting with 015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb not found: ID does not exist" containerID="015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.261416 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb"} err="failed to get container status \"015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb\": rpc error: code = NotFound desc = could not find container \"015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb\": container with ID starting with 015caa8d51bb5ac02279e0b7aa16d9eae5311f1075a8cf029eda49781440ccfb not found: ID does not exist" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.332452 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78rsc\" (UniqueName: \"kubernetes.io/projected/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513-kube-api-access-78rsc\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.427952 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.535411 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjdwq\" (UniqueName: \"kubernetes.io/projected/17e67c77-46ba-4230-b84d-bc8e6952b2d8-kube-api-access-xjdwq\") pod \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.535458 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e67c77-46ba-4230-b84d-bc8e6952b2d8-operator-scripts\") pod \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\" (UID: \"17e67c77-46ba-4230-b84d-bc8e6952b2d8\") " Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.536352 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e67c77-46ba-4230-b84d-bc8e6952b2d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17e67c77-46ba-4230-b84d-bc8e6952b2d8" (UID: "17e67c77-46ba-4230-b84d-bc8e6952b2d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.541200 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e67c77-46ba-4230-b84d-bc8e6952b2d8-kube-api-access-xjdwq" (OuterVolumeSpecName: "kube-api-access-xjdwq") pod "17e67c77-46ba-4230-b84d-bc8e6952b2d8" (UID: "17e67c77-46ba-4230-b84d-bc8e6952b2d8"). InnerVolumeSpecName "kube-api-access-xjdwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.637491 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjdwq\" (UniqueName: \"kubernetes.io/projected/17e67c77-46ba-4230-b84d-bc8e6952b2d8-kube-api-access-xjdwq\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:05 crc kubenswrapper[4892]: I0217 18:02:05.638139 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e67c77-46ba-4230-b84d-bc8e6952b2d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.074451 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zf6jg" event={"ID":"17e67c77-46ba-4230-b84d-bc8e6952b2d8","Type":"ContainerDied","Data":"1d6aa8889cf3c4274fc999b4dbb0fae917a512a6a95be47c846e3f368ecb1de2"} Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.074509 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6aa8889cf3c4274fc999b4dbb0fae917a512a6a95be47c846e3f368ecb1de2" Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.074952 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zf6jg" Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.078299 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tvldr" event={"ID":"49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513","Type":"ContainerDied","Data":"4d55d2347215e118f1acc7cd1cf7ad0799061e739916e8ffd39b98eaea541cb8"} Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.078335 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d55d2347215e118f1acc7cd1cf7ad0799061e739916e8ffd39b98eaea541cb8" Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.078389 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tvldr" Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.080219 4892 generic.go:334] "Generic (PLEG): container finished" podID="c72be734-0fa6-4a40-8f95-984db82d859c" containerID="6b32704fbc72ff9a7cc5ded27500a276ebad694cfbdf97871511f5b03b78561d" exitCode=0 Feb 17 18:02:06 crc kubenswrapper[4892]: I0217 18:02:06.080264 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xltk" event={"ID":"c72be734-0fa6-4a40-8f95-984db82d859c","Type":"ContainerDied","Data":"6b32704fbc72ff9a7cc5ded27500a276ebad694cfbdf97871511f5b03b78561d"} Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.117048 4892 generic.go:334] "Generic (PLEG): container finished" podID="0126c9e0-502f-4109-a6cf-25eccd572dff" containerID="cee32f5ada1e424bb8f30122322fd702190cef4acfca76eec13044ef381a90ff" exitCode=0 Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.117175 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36cd-account-create-update-9tkph" event={"ID":"0126c9e0-502f-4109-a6cf-25eccd572dff","Type":"ContainerDied","Data":"cee32f5ada1e424bb8f30122322fd702190cef4acfca76eec13044ef381a90ff"} Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.121233 4892 generic.go:334] "Generic (PLEG): container finished" podID="46aabc4c-95df-425a-bde5-db7523f34d7f" containerID="e26651b0dac4631f1d31c476692d4a84f0c1e5516e237a2c33f25e57e3776b8c" exitCode=0 Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.121347 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5c69-account-create-update-dl45f" event={"ID":"46aabc4c-95df-425a-bde5-db7523f34d7f","Type":"ContainerDied","Data":"e26651b0dac4631f1d31c476692d4a84f0c1e5516e237a2c33f25e57e3776b8c"} Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.124529 4892 generic.go:334] "Generic (PLEG): container finished" podID="f78e90e9-8804-4e06-8083-785c64a86a1d" 
containerID="20e78a2656091e6830683ed12bdcfe9347b0af0753251ba18342603e69dcb165" exitCode=0 Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.124588 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xgmsg" event={"ID":"f78e90e9-8804-4e06-8083-785c64a86a1d","Type":"ContainerDied","Data":"20e78a2656091e6830683ed12bdcfe9347b0af0753251ba18342603e69dcb165"} Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.127692 4892 generic.go:334] "Generic (PLEG): container finished" podID="fb8ce869-af70-43d7-b341-cf6c53900d97" containerID="ceb4c385d00646492bf20fac2ee2ee7bfac8d274b0a4abeaaee8d01b2d0db45c" exitCode=0 Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.127992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2050-account-create-update-k5ss6" event={"ID":"fb8ce869-af70-43d7-b341-cf6c53900d97","Type":"ContainerDied","Data":"ceb4c385d00646492bf20fac2ee2ee7bfac8d274b0a4abeaaee8d01b2d0db45c"} Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.509274 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xltk" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.582433 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n82gk\" (UniqueName: \"kubernetes.io/projected/c72be734-0fa6-4a40-8f95-984db82d859c-kube-api-access-n82gk\") pod \"c72be734-0fa6-4a40-8f95-984db82d859c\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.582486 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72be734-0fa6-4a40-8f95-984db82d859c-operator-scripts\") pod \"c72be734-0fa6-4a40-8f95-984db82d859c\" (UID: \"c72be734-0fa6-4a40-8f95-984db82d859c\") " Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.583109 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72be734-0fa6-4a40-8f95-984db82d859c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c72be734-0fa6-4a40-8f95-984db82d859c" (UID: "c72be734-0fa6-4a40-8f95-984db82d859c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.586698 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72be734-0fa6-4a40-8f95-984db82d859c-kube-api-access-n82gk" (OuterVolumeSpecName: "kube-api-access-n82gk") pod "c72be734-0fa6-4a40-8f95-984db82d859c" (UID: "c72be734-0fa6-4a40-8f95-984db82d859c"). InnerVolumeSpecName "kube-api-access-n82gk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.684151 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n82gk\" (UniqueName: \"kubernetes.io/projected/c72be734-0fa6-4a40-8f95-984db82d859c-kube-api-access-n82gk\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.684183 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72be734-0fa6-4a40-8f95-984db82d859c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.690942 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-m59z9"] Feb 17 18:02:07 crc kubenswrapper[4892]: E0217 18:02:07.691405 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72be734-0fa6-4a40-8f95-984db82d859c" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691424 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72be734-0fa6-4a40-8f95-984db82d859c" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: E0217 18:02:07.691440 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691447 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: E0217 18:02:07.691473 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerName="init" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691482 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerName="init" Feb 17 18:02:07 crc 
kubenswrapper[4892]: E0217 18:02:07.691498 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerName="dnsmasq-dns" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691504 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerName="dnsmasq-dns" Feb 17 18:02:07 crc kubenswrapper[4892]: E0217 18:02:07.691518 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e67c77-46ba-4230-b84d-bc8e6952b2d8" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691525 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e67c77-46ba-4230-b84d-bc8e6952b2d8" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691732 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="059760ef-70ba-461a-8ff9-e873c3e9501a" containerName="dnsmasq-dns" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691756 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72be734-0fa6-4a40-8f95-984db82d859c" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691773 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e67c77-46ba-4230-b84d-bc8e6952b2d8" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.691785 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" containerName="mariadb-database-create" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.692407 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.699373 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m59z9"] Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.702565 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.784835 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb22j\" (UniqueName: \"kubernetes.io/projected/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-kube-api-access-tb22j\") pod \"root-account-create-update-m59z9\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.785114 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-operator-scripts\") pod \"root-account-create-update-m59z9\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.887724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb22j\" (UniqueName: \"kubernetes.io/projected/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-kube-api-access-tb22j\") pod \"root-account-create-update-m59z9\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.887842 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-operator-scripts\") pod \"root-account-create-update-m59z9\" (UID: 
\"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.888599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-operator-scripts\") pod \"root-account-create-update-m59z9\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:07 crc kubenswrapper[4892]: I0217 18:02:07.915110 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb22j\" (UniqueName: \"kubernetes.io/projected/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-kube-api-access-tb22j\") pod \"root-account-create-update-m59z9\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.009934 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.142799 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xltk" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.143101 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xltk" event={"ID":"c72be734-0fa6-4a40-8f95-984db82d859c","Type":"ContainerDied","Data":"5d7067482fa1f32b3456410b256ff6fbbbdc9dd6a1a70f2b44e6ef0f424e2683"} Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.144261 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d7067482fa1f32b3456410b256ff6fbbbdc9dd6a1a70f2b44e6ef0f424e2683" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.300075 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.304948 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"swift-storage-0\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " pod="openstack/swift-storage-0" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.515751 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m59z9"] Feb 17 18:02:08 crc kubenswrapper[4892]: W0217 18:02:08.520855 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a3ae569_94c4_4d8e_b97c_b73a0f4c2b5c.slice/crio-cbfd373a15032502d15ba860a6006e019202e0ed218cf1e8a119fc9635dbd720 WatchSource:0}: Error finding container cbfd373a15032502d15ba860a6006e019202e0ed218cf1e8a119fc9635dbd720: Status 404 returned error can't find the container with id 
cbfd373a15032502d15ba860a6006e019202e0ed218cf1e8a119fc9635dbd720 Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.561387 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.717884 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.724597 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xgmsg" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.731631 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.764241 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.808964 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-ring-data-devices\") pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809018 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-swiftconf\") pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809056 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-combined-ca-bundle\") 
pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809076 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-dispersionconf\") pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809104 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7ljm\" (UniqueName: \"kubernetes.io/projected/f78e90e9-8804-4e06-8083-785c64a86a1d-kube-api-access-j7ljm\") pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809193 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f78e90e9-8804-4e06-8083-785c64a86a1d-etc-swift\") pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809220 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-scripts\") pod \"f78e90e9-8804-4e06-8083-785c64a86a1d\" (UID: \"f78e90e9-8804-4e06-8083-785c64a86a1d\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809239 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb8ce869-af70-43d7-b341-cf6c53900d97-operator-scripts\") pod \"fb8ce869-af70-43d7-b341-cf6c53900d97\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809305 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aabc4c-95df-425a-bde5-db7523f34d7f-operator-scripts\") pod \"46aabc4c-95df-425a-bde5-db7523f34d7f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809361 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq6xq\" (UniqueName: \"kubernetes.io/projected/0126c9e0-502f-4109-a6cf-25eccd572dff-kube-api-access-rq6xq\") pod \"0126c9e0-502f-4109-a6cf-25eccd572dff\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809395 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtrsz\" (UniqueName: \"kubernetes.io/projected/46aabc4c-95df-425a-bde5-db7523f34d7f-kube-api-access-jtrsz\") pod \"46aabc4c-95df-425a-bde5-db7523f34d7f\" (UID: \"46aabc4c-95df-425a-bde5-db7523f34d7f\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809428 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0126c9e0-502f-4109-a6cf-25eccd572dff-operator-scripts\") pod \"0126c9e0-502f-4109-a6cf-25eccd572dff\" (UID: \"0126c9e0-502f-4109-a6cf-25eccd572dff\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.809453 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8svl\" (UniqueName: \"kubernetes.io/projected/fb8ce869-af70-43d7-b341-cf6c53900d97-kube-api-access-v8svl\") pod \"fb8ce869-af70-43d7-b341-cf6c53900d97\" (UID: \"fb8ce869-af70-43d7-b341-cf6c53900d97\") " Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.813327 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f78e90e9-8804-4e06-8083-785c64a86a1d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: 
"f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.814542 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46aabc4c-95df-425a-bde5-db7523f34d7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46aabc4c-95df-425a-bde5-db7523f34d7f" (UID: "46aabc4c-95df-425a-bde5-db7523f34d7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.814769 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ce869-af70-43d7-b341-cf6c53900d97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb8ce869-af70-43d7-b341-cf6c53900d97" (UID: "fb8ce869-af70-43d7-b341-cf6c53900d97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.815239 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0126c9e0-502f-4109-a6cf-25eccd572dff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0126c9e0-502f-4109-a6cf-25eccd572dff" (UID: "0126c9e0-502f-4109-a6cf-25eccd572dff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.815976 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: "f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.824125 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8ce869-af70-43d7-b341-cf6c53900d97-kube-api-access-v8svl" (OuterVolumeSpecName: "kube-api-access-v8svl") pod "fb8ce869-af70-43d7-b341-cf6c53900d97" (UID: "fb8ce869-af70-43d7-b341-cf6c53900d97"). InnerVolumeSpecName "kube-api-access-v8svl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.827644 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78e90e9-8804-4e06-8083-785c64a86a1d-kube-api-access-j7ljm" (OuterVolumeSpecName: "kube-api-access-j7ljm") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: "f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "kube-api-access-j7ljm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.832628 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0126c9e0-502f-4109-a6cf-25eccd572dff-kube-api-access-rq6xq" (OuterVolumeSpecName: "kube-api-access-rq6xq") pod "0126c9e0-502f-4109-a6cf-25eccd572dff" (UID: "0126c9e0-502f-4109-a6cf-25eccd572dff"). InnerVolumeSpecName "kube-api-access-rq6xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.835332 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46aabc4c-95df-425a-bde5-db7523f34d7f-kube-api-access-jtrsz" (OuterVolumeSpecName: "kube-api-access-jtrsz") pod "46aabc4c-95df-425a-bde5-db7523f34d7f" (UID: "46aabc4c-95df-425a-bde5-db7523f34d7f"). InnerVolumeSpecName "kube-api-access-jtrsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.835371 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: "f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.851754 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-scripts" (OuterVolumeSpecName: "scripts") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: "f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.857943 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: "f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.860301 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f78e90e9-8804-4e06-8083-785c64a86a1d" (UID: "f78e90e9-8804-4e06-8083-785c64a86a1d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.916279 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0126c9e0-502f-4109-a6cf-25eccd572dff-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.916537 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8svl\" (UniqueName: \"kubernetes.io/projected/fb8ce869-af70-43d7-b341-cf6c53900d97-kube-api-access-v8svl\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.916627 4892 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.916751 4892 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.916829 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.916948 4892 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f78e90e9-8804-4e06-8083-785c64a86a1d-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.917053 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7ljm\" (UniqueName: \"kubernetes.io/projected/f78e90e9-8804-4e06-8083-785c64a86a1d-kube-api-access-j7ljm\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: 
I0217 18:02:08.917127 4892 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f78e90e9-8804-4e06-8083-785c64a86a1d-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.917203 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f78e90e9-8804-4e06-8083-785c64a86a1d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.917276 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb8ce869-af70-43d7-b341-cf6c53900d97-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.917356 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aabc4c-95df-425a-bde5-db7523f34d7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.917428 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq6xq\" (UniqueName: \"kubernetes.io/projected/0126c9e0-502f-4109-a6cf-25eccd572dff-kube-api-access-rq6xq\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:08 crc kubenswrapper[4892]: I0217 18:02:08.917512 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtrsz\" (UniqueName: \"kubernetes.io/projected/46aabc4c-95df-425a-bde5-db7523f34d7f-kube-api-access-jtrsz\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.117767 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.185571 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"a5f99743a51ff9d0dbf46cc29bd6dccbd9b931151c253386b705f0b24776bb75"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.189785 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5c69-account-create-update-dl45f" event={"ID":"46aabc4c-95df-425a-bde5-db7523f34d7f","Type":"ContainerDied","Data":"40bd37f18d08276d6fe35bf92ebf86226d5675bb83c184a466363fdaf1031007"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.189847 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40bd37f18d08276d6fe35bf92ebf86226d5675bb83c184a466363fdaf1031007" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.189924 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5c69-account-create-update-dl45f" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.196520 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xgmsg" event={"ID":"f78e90e9-8804-4e06-8083-785c64a86a1d","Type":"ContainerDied","Data":"9b10c1628d240ccc2d5772a96deb0f12c56b1ef94f016d0153bca18114e366b3"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.196564 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b10c1628d240ccc2d5772a96deb0f12c56b1ef94f016d0153bca18114e366b3" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.196643 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xgmsg" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.203299 4892 generic.go:334] "Generic (PLEG): container finished" podID="3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" containerID="8a8dca3429143b599989ef7ff2f7114f30539084187e904a5b30eee4fcfc5b7f" exitCode=0 Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.203387 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m59z9" event={"ID":"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c","Type":"ContainerDied","Data":"8a8dca3429143b599989ef7ff2f7114f30539084187e904a5b30eee4fcfc5b7f"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.203421 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m59z9" event={"ID":"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c","Type":"ContainerStarted","Data":"cbfd373a15032502d15ba860a6006e019202e0ed218cf1e8a119fc9635dbd720"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.205144 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2050-account-create-update-k5ss6" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.205130 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2050-account-create-update-k5ss6" event={"ID":"fb8ce869-af70-43d7-b341-cf6c53900d97","Type":"ContainerDied","Data":"88e4ccdc5286e61812f3364e7e9f7edfb9e18100d3375e34665db970b3866e10"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.205276 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88e4ccdc5286e61812f3364e7e9f7edfb9e18100d3375e34665db970b3866e10" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.207048 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36cd-account-create-update-9tkph" event={"ID":"0126c9e0-502f-4109-a6cf-25eccd572dff","Type":"ContainerDied","Data":"bee4189c7539f35f8f9e2a0eb3a697b47e5a5d4eef89ead1df0f8682c8e94d4c"} Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.207084 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bee4189c7539f35f8f9e2a0eb3a697b47e5a5d4eef89ead1df0f8682c8e94d4c" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.207096 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-36cd-account-create-update-9tkph" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.765093 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rvn94"] Feb 17 18:02:09 crc kubenswrapper[4892]: E0217 18:02:09.765815 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0126c9e0-502f-4109-a6cf-25eccd572dff" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.765834 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0126c9e0-502f-4109-a6cf-25eccd572dff" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: E0217 18:02:09.765872 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8ce869-af70-43d7-b341-cf6c53900d97" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.765882 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ce869-af70-43d7-b341-cf6c53900d97" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: E0217 18:02:09.765895 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aabc4c-95df-425a-bde5-db7523f34d7f" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.765904 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aabc4c-95df-425a-bde5-db7523f34d7f" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: E0217 18:02:09.765922 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78e90e9-8804-4e06-8083-785c64a86a1d" containerName="swift-ring-rebalance" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.765931 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78e90e9-8804-4e06-8083-785c64a86a1d" containerName="swift-ring-rebalance" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.766219 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8ce869-af70-43d7-b341-cf6c53900d97" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.766254 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="46aabc4c-95df-425a-bde5-db7523f34d7f" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.766263 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0126c9e0-502f-4109-a6cf-25eccd572dff" containerName="mariadb-account-create-update" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.766273 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78e90e9-8804-4e06-8083-785c64a86a1d" containerName="swift-ring-rebalance" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.766866 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.769314 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.769530 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ltggx" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.776114 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvn94"] Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.842097 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-db-sync-config-data\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.842137 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-config-data\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.842164 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-combined-ca-bundle\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.842185 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkww\" (UniqueName: \"kubernetes.io/projected/b26d854a-7e7b-4a84-a611-d0672cea173d-kube-api-access-ckkww\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.944938 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-db-sync-config-data\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.944997 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-config-data\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.945035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-combined-ca-bundle\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.945062 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkww\" (UniqueName: \"kubernetes.io/projected/b26d854a-7e7b-4a84-a611-d0672cea173d-kube-api-access-ckkww\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.950370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-db-sync-config-data\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.950721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-config-data\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.951392 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-combined-ca-bundle\") pod \"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:09 crc kubenswrapper[4892]: I0217 18:02:09.962343 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkww\" (UniqueName: \"kubernetes.io/projected/b26d854a-7e7b-4a84-a611-d0672cea173d-kube-api-access-ckkww\") pod 
\"glance-db-sync-rvn94\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.089998 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.386387 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.696919 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.763430 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb22j\" (UniqueName: \"kubernetes.io/projected/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-kube-api-access-tb22j\") pod \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.763471 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-operator-scripts\") pod \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\" (UID: \"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c\") " Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.764241 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" (UID: "3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.767880 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-kube-api-access-tb22j" (OuterVolumeSpecName: "kube-api-access-tb22j") pod "3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" (UID: "3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c"). InnerVolumeSpecName "kube-api-access-tb22j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.839153 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvn94"] Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.866096 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb22j\" (UniqueName: \"kubernetes.io/projected/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-kube-api-access-tb22j\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:10 crc kubenswrapper[4892]: I0217 18:02:10.866127 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:10 crc kubenswrapper[4892]: W0217 18:02:10.892931 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26d854a_7e7b_4a84_a611_d0672cea173d.slice/crio-91093520ca360d4b71562db72ce92327931ae1ad58357923811871124c617de8 WatchSource:0}: Error finding container 91093520ca360d4b71562db72ce92327931ae1ad58357923811871124c617de8: Status 404 returned error can't find the container with id 91093520ca360d4b71562db72ce92327931ae1ad58357923811871124c617de8 Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.227564 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvn94" 
event={"ID":"b26d854a-7e7b-4a84-a611-d0672cea173d","Type":"ContainerStarted","Data":"91093520ca360d4b71562db72ce92327931ae1ad58357923811871124c617de8"} Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.229007 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m59z9" event={"ID":"3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c","Type":"ContainerDied","Data":"cbfd373a15032502d15ba860a6006e019202e0ed218cf1e8a119fc9635dbd720"} Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.229034 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbfd373a15032502d15ba860a6006e019202e0ed218cf1e8a119fc9635dbd720" Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.229040 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m59z9" Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.230307 4892 generic.go:334] "Generic (PLEG): container finished" podID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerID="503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f" exitCode=0 Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.230438 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a991e29-288f-453d-9bb4-f8d90a2689ad","Type":"ContainerDied","Data":"503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f"} Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.233921 4892 generic.go:334] "Generic (PLEG): container finished" podID="60523b2e-a498-4bc9-920b-32f117afb898" containerID="e99502375ab69e8b92fee8d394383cb65bc6d54cb14279a57cb6a3666079c978" exitCode=0 Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.233985 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"60523b2e-a498-4bc9-920b-32f117afb898","Type":"ContainerDied","Data":"e99502375ab69e8b92fee8d394383cb65bc6d54cb14279a57cb6a3666079c978"} 
Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.239907 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"3cf97b246bab6355f5a8e7bc81f3bddf502ccd3c47dc0316fda01c98e9d599dc"}
Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.239949 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"6ad15eb01bd19157c80c0bb24ae02bfdeddfc5c770f3fde85bbd8a6f9ea1c582"}
Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.239959 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"5215ee6df896cbac8711678f30bfc8c34ed8386052d37de70cf674a5aa175b70"}
Feb 17 18:02:11 crc kubenswrapper[4892]: I0217 18:02:11.239967 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"736363b29d80b41bd0b961a49490e559581f0ae8dd26533460a7abdb41743241"}
Feb 17 18:02:12 crc kubenswrapper[4892]: I0217 18:02:12.256894 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a991e29-288f-453d-9bb4-f8d90a2689ad","Type":"ContainerStarted","Data":"fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea"}
Feb 17 18:02:12 crc kubenswrapper[4892]: I0217 18:02:12.257727 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 17 18:02:12 crc kubenswrapper[4892]: I0217 18:02:12.260508 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"60523b2e-a498-4bc9-920b-32f117afb898","Type":"ContainerStarted","Data":"c90905bd645724d583f06264c835c33dbac29aab54cd0307b6a53d4573add124"}
Feb 17 18:02:12 crc kubenswrapper[4892]: I0217 18:02:12.260815 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 18:02:12 crc kubenswrapper[4892]: I0217 18:02:12.294268 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.588307515 podStartE2EDuration="58.294248989s" podCreationTimestamp="2026-02-17 18:01:14 +0000 UTC" firstStartedPulling="2026-02-17 18:01:16.671086263 +0000 UTC m=+1048.046489528" lastFinishedPulling="2026-02-17 18:01:37.377027737 +0000 UTC m=+1068.752431002" observedRunningTime="2026-02-17 18:02:12.292601396 +0000 UTC m=+1103.668004701" watchObservedRunningTime="2026-02-17 18:02:12.294248989 +0000 UTC m=+1103.669652254"
Feb 17 18:02:14 crc kubenswrapper[4892]: I0217 18:02:14.283914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"58e0df0679c6d02fa3c1eeeb9fd8b0c6a7d302f3247b2ed8e7c839d88c861c37"}
Feb 17 18:02:14 crc kubenswrapper[4892]: I0217 18:02:14.284375 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"ecc298da1bf313ad808f705d76569aadd8ccd7b7dec598f4818582d15e76c068"}
Feb 17 18:02:14 crc kubenswrapper[4892]: I0217 18:02:14.284386 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"0f12b3eda7e47024c7abf99bbcbb51563466d6864f1fb053fa6d1defadfb7e85"}
Feb 17 18:02:14 crc kubenswrapper[4892]: I0217 18:02:14.284393 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"c1d342baf3f40d0f5eafc73a64df7c0cd93b1f05a6ea729e27e0c6aa8625b809"}
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.288410 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bz7v2" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller" probeResult="failure" output=<
Feb 17 18:02:15 crc kubenswrapper[4892]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 17 18:02:15 crc kubenswrapper[4892]: >
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.332684 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8n9s7"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.356081 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.120249025 podStartE2EDuration="1m1.356062406s" podCreationTimestamp="2026-02-17 18:01:14 +0000 UTC" firstStartedPulling="2026-02-17 18:01:16.123353135 +0000 UTC m=+1047.498756400" lastFinishedPulling="2026-02-17 18:01:36.359166516 +0000 UTC m=+1067.734569781" observedRunningTime="2026-02-17 18:02:12.325634979 +0000 UTC m=+1103.701038254" watchObservedRunningTime="2026-02-17 18:02:15.356062406 +0000 UTC m=+1106.731465671"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.371001 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8n9s7"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.697875 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bz7v2-config-phc6w"]
Feb 17 18:02:15 crc kubenswrapper[4892]: E0217 18:02:15.698749 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" containerName="mariadb-account-create-update"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.698839 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" containerName="mariadb-account-create-update"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.699993 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" containerName="mariadb-account-create-update"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.700654 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.711727 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.712622 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bz7v2-config-phc6w"]
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.788841 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qkg\" (UniqueName: \"kubernetes.io/projected/b444f257-f61f-475d-8e7b-7bca049b5244-kube-api-access-f4qkg\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.788907 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-log-ovn\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.789008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.789170 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-scripts\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.789667 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-additional-scripts\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.789736 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run-ovn\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.890589 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-additional-scripts\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.890633 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run-ovn\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.890702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qkg\" (UniqueName: \"kubernetes.io/projected/b444f257-f61f-475d-8e7b-7bca049b5244-kube-api-access-f4qkg\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.890735 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-log-ovn\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.890762 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.890794 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-scripts\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.891204 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-log-ovn\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.891205 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.891532 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-additional-scripts\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.891963 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run-ovn\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.892599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-scripts\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:15 crc kubenswrapper[4892]: I0217 18:02:15.910730 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qkg\" (UniqueName: \"kubernetes.io/projected/b444f257-f61f-475d-8e7b-7bca049b5244-kube-api-access-f4qkg\") pod \"ovn-controller-bz7v2-config-phc6w\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") " pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:16 crc kubenswrapper[4892]: I0217 18:02:16.089456 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:16 crc kubenswrapper[4892]: I0217 18:02:16.335913 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"4a392a3a4be90b644c07bd0ce9b8c0d432df6220823557ff7db78a582c418255"}
Feb 17 18:02:16 crc kubenswrapper[4892]: I0217 18:02:16.336229 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"651defaed761921ccf7c7a6cc3b90f243a62d645f799923b98e7b01c8fdaecf1"}
Feb 17 18:02:16 crc kubenswrapper[4892]: I0217 18:02:16.336249 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"94a10ed2d5fee50a1d21331a9c66ebfdc3f24fee2209a9431f1451ef17c41252"}
Feb 17 18:02:16 crc kubenswrapper[4892]: I0217 18:02:16.336260 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"4d5dff4d38a7a79f798428eae45a2013e22a5357dd2fcd764f8587d896667672"}
Feb 17 18:02:16 crc kubenswrapper[4892]: I0217 18:02:16.584267 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bz7v2-config-phc6w"]
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.346144 4892 generic.go:334] "Generic (PLEG): container finished" podID="b444f257-f61f-475d-8e7b-7bca049b5244" containerID="71ee82ac13e1236283665727695224d81d671980ccde4309b03ac1b224f8754c" exitCode=0
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.346233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2-config-phc6w" event={"ID":"b444f257-f61f-475d-8e7b-7bca049b5244","Type":"ContainerDied","Data":"71ee82ac13e1236283665727695224d81d671980ccde4309b03ac1b224f8754c"}
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.346732 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2-config-phc6w" event={"ID":"b444f257-f61f-475d-8e7b-7bca049b5244","Type":"ContainerStarted","Data":"2575b434ded0249cdf5d371513f3d798adb2565f454783562f113f3289914748"}
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.354544 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"40d70537bfb2a1c3a8fc4994456c79434226023147aca15f0cc917598c474f70"}
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.354781 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"560fe248893cb8f1a43c62388a6d42d718ed7cc7f26d7e07babf294df3388633"}
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.354873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerStarted","Data":"ec801cf9cb9854df9d768df41b9402e3ae9e7297f0a3ac7eca720a4609e68e35"}
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.406247 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.368756433 podStartE2EDuration="26.406225887s" podCreationTimestamp="2026-02-17 18:01:51 +0000 UTC" firstStartedPulling="2026-02-17 18:02:09.123878271 +0000 UTC m=+1100.499281536" lastFinishedPulling="2026-02-17 18:02:15.161347725 +0000 UTC m=+1106.536750990" observedRunningTime="2026-02-17 18:02:17.397179335 +0000 UTC m=+1108.772582620" watchObservedRunningTime="2026-02-17 18:02:17.406225887 +0000 UTC m=+1108.781629152"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.683036 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cnqps"]
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.684977 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.696238 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.701051 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cnqps"]
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.851441 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.851502 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hdfd\" (UniqueName: \"kubernetes.io/projected/08f3009f-758d-4bca-b836-ea3ba6e8d097-kube-api-access-9hdfd\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.851991 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-config\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.852091 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.852161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.852221 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.953614 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.953890 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hdfd\" (UniqueName: \"kubernetes.io/projected/08f3009f-758d-4bca-b836-ea3ba6e8d097-kube-api-access-9hdfd\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.953922 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-config\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.953966 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.954008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.954044 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.954619 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.954750 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.955026 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-config\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.955271 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.957352 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:17 crc kubenswrapper[4892]: I0217 18:02:17.986681 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hdfd\" (UniqueName: \"kubernetes.io/projected/08f3009f-758d-4bca-b836-ea3ba6e8d097-kube-api-access-9hdfd\") pod \"dnsmasq-dns-6d5b6d6b67-cnqps\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:18 crc kubenswrapper[4892]: I0217 18:02:18.013725 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:20 crc kubenswrapper[4892]: I0217 18:02:20.234313 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bz7v2"
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.099141 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.210187 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4qkg\" (UniqueName: \"kubernetes.io/projected/b444f257-f61f-475d-8e7b-7bca049b5244-kube-api-access-f4qkg\") pod \"b444f257-f61f-475d-8e7b-7bca049b5244\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") "
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.210495 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run-ovn\") pod \"b444f257-f61f-475d-8e7b-7bca049b5244\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") "
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.210590 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run\") pod \"b444f257-f61f-475d-8e7b-7bca049b5244\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") "
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.210717 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-additional-scripts\") pod \"b444f257-f61f-475d-8e7b-7bca049b5244\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") "
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.210824 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-scripts\") pod \"b444f257-f61f-475d-8e7b-7bca049b5244\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") "
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.210871 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-log-ovn\") pod \"b444f257-f61f-475d-8e7b-7bca049b5244\" (UID: \"b444f257-f61f-475d-8e7b-7bca049b5244\") "
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.211388 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b444f257-f61f-475d-8e7b-7bca049b5244" (UID: "b444f257-f61f-475d-8e7b-7bca049b5244"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.211444 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b444f257-f61f-475d-8e7b-7bca049b5244" (UID: "b444f257-f61f-475d-8e7b-7bca049b5244"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.211466 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run" (OuterVolumeSpecName: "var-run") pod "b444f257-f61f-475d-8e7b-7bca049b5244" (UID: "b444f257-f61f-475d-8e7b-7bca049b5244"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.212256 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b444f257-f61f-475d-8e7b-7bca049b5244" (UID: "b444f257-f61f-475d-8e7b-7bca049b5244"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.212395 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-scripts" (OuterVolumeSpecName: "scripts") pod "b444f257-f61f-475d-8e7b-7bca049b5244" (UID: "b444f257-f61f-475d-8e7b-7bca049b5244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.214325 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b444f257-f61f-475d-8e7b-7bca049b5244-kube-api-access-f4qkg" (OuterVolumeSpecName: "kube-api-access-f4qkg") pod "b444f257-f61f-475d-8e7b-7bca049b5244" (UID: "b444f257-f61f-475d-8e7b-7bca049b5244"). InnerVolumeSpecName "kube-api-access-f4qkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.313164 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4qkg\" (UniqueName: \"kubernetes.io/projected/b444f257-f61f-475d-8e7b-7bca049b5244-kube-api-access-f4qkg\") on node \"crc\" DevicePath \"\""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.313206 4892 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.313222 4892 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-run\") on node \"crc\" DevicePath \"\""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.313234 4892 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.313248 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444f257-f61f-475d-8e7b-7bca049b5244-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.313258 4892 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b444f257-f61f-475d-8e7b-7bca049b5244-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.370146 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cnqps"]
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.448073 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2-config-phc6w" event={"ID":"b444f257-f61f-475d-8e7b-7bca049b5244","Type":"ContainerDied","Data":"2575b434ded0249cdf5d371513f3d798adb2565f454783562f113f3289914748"}
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.448115 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2575b434ded0249cdf5d371513f3d798adb2565f454783562f113f3289914748"
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.448151 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2-config-phc6w"
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.449529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" event={"ID":"08f3009f-758d-4bca-b836-ea3ba6e8d097","Type":"ContainerStarted","Data":"758d5deb7c3c33b37f9970c52818e6b886ae96e8c4252624a7afc935347cbb5c"}
Feb 17 18:02:25 crc kubenswrapper[4892]: I0217 18:02:25.578375 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused"
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.065252 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.131:5671: connect: connection refused"
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.214237 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bz7v2-config-phc6w"]
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.233908 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bz7v2-config-phc6w"]
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.459184 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvn94" event={"ID":"b26d854a-7e7b-4a84-a611-d0672cea173d","Type":"ContainerStarted","Data":"d1ffbfbf7d7e526d9f7eb72bc01754b35f328feb3b5ac960d6e752e50581986e"}
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.461039 4892 generic.go:334] "Generic (PLEG): container finished" podID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerID="b3045e29696030e8c5b4ad9566b0e640b3288767304a646d75ee3d334a28d30a" exitCode=0
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.461089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" event={"ID":"08f3009f-758d-4bca-b836-ea3ba6e8d097","Type":"ContainerDied","Data":"b3045e29696030e8c5b4ad9566b0e640b3288767304a646d75ee3d334a28d30a"}
Feb 17 18:02:26 crc kubenswrapper[4892]: I0217 18:02:26.507195 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rvn94" podStartSLOduration=3.444135453 podStartE2EDuration="17.507178881s" podCreationTimestamp="2026-02-17 18:02:09 +0000 UTC" firstStartedPulling="2026-02-17 18:02:10.89610147 +0000 UTC m=+1102.271504725" lastFinishedPulling="2026-02-17 18:02:24.959144888 +0000 UTC m=+1116.334548153" observedRunningTime="2026-02-17 18:02:26.478211375 +0000 UTC m=+1117.853614640" watchObservedRunningTime="2026-02-17 18:02:26.507178881 +0000 UTC m=+1117.882582146"
Feb 17 18:02:27 crc kubenswrapper[4892]: I0217 18:02:27.370276 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b444f257-f61f-475d-8e7b-7bca049b5244" path="/var/lib/kubelet/pods/b444f257-f61f-475d-8e7b-7bca049b5244/volumes"
Feb 17 18:02:27 crc kubenswrapper[4892]: I0217 18:02:27.480787 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" event={"ID":"08f3009f-758d-4bca-b836-ea3ba6e8d097","Type":"ContainerStarted","Data":"6b42ecebf51df8c144389699c3acf38d14ba7ddca00774dcc4e5cab68116355d"}
Feb 17 18:02:27 crc kubenswrapper[4892]: I0217 18:02:27.480918 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:27 crc kubenswrapper[4892]: I0217 18:02:27.505082 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" podStartSLOduration=10.505067198999999 podStartE2EDuration="10.505067199s" podCreationTimestamp="2026-02-17 18:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:27.50287778 +0000 UTC m=+1118.878281056" watchObservedRunningTime="2026-02-17 18:02:27.505067199 +0000 UTC m=+1118.880470474"
Feb 17 18:02:31 crc kubenswrapper[4892]: I0217 18:02:31.534000 4892 generic.go:334] "Generic (PLEG): container finished" podID="b26d854a-7e7b-4a84-a611-d0672cea173d" containerID="d1ffbfbf7d7e526d9f7eb72bc01754b35f328feb3b5ac960d6e752e50581986e" exitCode=0
Feb 17 18:02:31 crc kubenswrapper[4892]: I0217 18:02:31.534070 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvn94" event={"ID":"b26d854a-7e7b-4a84-a611-d0672cea173d","Type":"ContainerDied","Data":"d1ffbfbf7d7e526d9f7eb72bc01754b35f328feb3b5ac960d6e752e50581986e"}
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.010131 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvn94"
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.016088 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps"
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.095245 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hkkrz"]
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.095652 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" containerName="dnsmasq-dns" containerID="cri-o://ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e" gracePeriod=10
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.159324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-db-sync-config-data\") pod \"b26d854a-7e7b-4a84-a611-d0672cea173d\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") "
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.159363 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-config-data\") pod \"b26d854a-7e7b-4a84-a611-d0672cea173d\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") "
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.159425 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-combined-ca-bundle\") pod \"b26d854a-7e7b-4a84-a611-d0672cea173d\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") "
Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.159628 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ckkww\" (UniqueName: \"kubernetes.io/projected/b26d854a-7e7b-4a84-a611-d0672cea173d-kube-api-access-ckkww\") pod \"b26d854a-7e7b-4a84-a611-d0672cea173d\" (UID: \"b26d854a-7e7b-4a84-a611-d0672cea173d\") " Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.171232 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b26d854a-7e7b-4a84-a611-d0672cea173d" (UID: "b26d854a-7e7b-4a84-a611-d0672cea173d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.188201 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26d854a-7e7b-4a84-a611-d0672cea173d-kube-api-access-ckkww" (OuterVolumeSpecName: "kube-api-access-ckkww") pod "b26d854a-7e7b-4a84-a611-d0672cea173d" (UID: "b26d854a-7e7b-4a84-a611-d0672cea173d"). InnerVolumeSpecName "kube-api-access-ckkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.196440 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b26d854a-7e7b-4a84-a611-d0672cea173d" (UID: "b26d854a-7e7b-4a84-a611-d0672cea173d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.217076 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-config-data" (OuterVolumeSpecName: "config-data") pod "b26d854a-7e7b-4a84-a611-d0672cea173d" (UID: "b26d854a-7e7b-4a84-a611-d0672cea173d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.261653 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkww\" (UniqueName: \"kubernetes.io/projected/b26d854a-7e7b-4a84-a611-d0672cea173d-kube-api-access-ckkww\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.261689 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.261699 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.261708 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d854a-7e7b-4a84-a611-d0672cea173d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.502162 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.562047 4892 generic.go:334] "Generic (PLEG): container finished" podID="a40576fc-fbd3-45f5-afb8-50de90642017" containerID="ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e" exitCode=0 Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.562238 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.562124 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" event={"ID":"a40576fc-fbd3-45f5-afb8-50de90642017","Type":"ContainerDied","Data":"ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e"} Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.562697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hkkrz" event={"ID":"a40576fc-fbd3-45f5-afb8-50de90642017","Type":"ContainerDied","Data":"cb332a6d06af9a760e318c299c1023164b04aa143f34562f8b24e09089f6f829"} Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.562718 4892 scope.go:117] "RemoveContainer" containerID="ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.565054 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvn94" event={"ID":"b26d854a-7e7b-4a84-a611-d0672cea173d","Type":"ContainerDied","Data":"91093520ca360d4b71562db72ce92327931ae1ad58357923811871124c617de8"} Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.565086 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91093520ca360d4b71562db72ce92327931ae1ad58357923811871124c617de8" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.565139 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvn94" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.600411 4892 scope.go:117] "RemoveContainer" containerID="1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.620494 4892 scope.go:117] "RemoveContainer" containerID="ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e" Feb 17 18:02:33 crc kubenswrapper[4892]: E0217 18:02:33.622976 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e\": container with ID starting with ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e not found: ID does not exist" containerID="ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.623023 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e"} err="failed to get container status \"ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e\": rpc error: code = NotFound desc = could not find container \"ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e\": container with ID starting with ee31744dbff65e585731ce1bce9c8769097e005abb95fe945b70dda9c322656e not found: ID does not exist" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.623047 4892 scope.go:117] "RemoveContainer" containerID="1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22" Feb 17 18:02:33 crc kubenswrapper[4892]: E0217 18:02:33.623874 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22\": container with ID starting with 
1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22 not found: ID does not exist" containerID="1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.623909 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22"} err="failed to get container status \"1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22\": rpc error: code = NotFound desc = could not find container \"1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22\": container with ID starting with 1bef56092db4c9c5e74ecfc6582f80b4c9ec88e7d419fae6c9b95972e9977a22 not found: ID does not exist" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.673187 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-nb\") pod \"a40576fc-fbd3-45f5-afb8-50de90642017\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.673350 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-config\") pod \"a40576fc-fbd3-45f5-afb8-50de90642017\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.673379 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-dns-svc\") pod \"a40576fc-fbd3-45f5-afb8-50de90642017\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.673475 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-sb\") pod \"a40576fc-fbd3-45f5-afb8-50de90642017\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.673538 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xbtj\" (UniqueName: \"kubernetes.io/projected/a40576fc-fbd3-45f5-afb8-50de90642017-kube-api-access-4xbtj\") pod \"a40576fc-fbd3-45f5-afb8-50de90642017\" (UID: \"a40576fc-fbd3-45f5-afb8-50de90642017\") " Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.677728 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40576fc-fbd3-45f5-afb8-50de90642017-kube-api-access-4xbtj" (OuterVolumeSpecName: "kube-api-access-4xbtj") pod "a40576fc-fbd3-45f5-afb8-50de90642017" (UID: "a40576fc-fbd3-45f5-afb8-50de90642017"). InnerVolumeSpecName "kube-api-access-4xbtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.715071 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a40576fc-fbd3-45f5-afb8-50de90642017" (UID: "a40576fc-fbd3-45f5-afb8-50de90642017"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.725505 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a40576fc-fbd3-45f5-afb8-50de90642017" (UID: "a40576fc-fbd3-45f5-afb8-50de90642017"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.728127 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-config" (OuterVolumeSpecName: "config") pod "a40576fc-fbd3-45f5-afb8-50de90642017" (UID: "a40576fc-fbd3-45f5-afb8-50de90642017"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.740752 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a40576fc-fbd3-45f5-afb8-50de90642017" (UID: "a40576fc-fbd3-45f5-afb8-50de90642017"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.775761 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.775804 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xbtj\" (UniqueName: \"kubernetes.io/projected/a40576fc-fbd3-45f5-afb8-50de90642017-kube-api-access-4xbtj\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.775834 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.775844 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc 
kubenswrapper[4892]: I0217 18:02:33.775853 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40576fc-fbd3-45f5-afb8-50de90642017-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.913436 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hkkrz"] Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.921863 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hkkrz"] Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974019 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-b2865"] Feb 17 18:02:33 crc kubenswrapper[4892]: E0217 18:02:33.974388 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" containerName="dnsmasq-dns" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974404 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" containerName="dnsmasq-dns" Feb 17 18:02:33 crc kubenswrapper[4892]: E0217 18:02:33.974416 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" containerName="init" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974421 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" containerName="init" Feb 17 18:02:33 crc kubenswrapper[4892]: E0217 18:02:33.974438 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b444f257-f61f-475d-8e7b-7bca049b5244" containerName="ovn-config" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974445 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b444f257-f61f-475d-8e7b-7bca049b5244" containerName="ovn-config" Feb 17 18:02:33 crc kubenswrapper[4892]: E0217 18:02:33.974466 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b26d854a-7e7b-4a84-a611-d0672cea173d" containerName="glance-db-sync" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974474 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26d854a-7e7b-4a84-a611-d0672cea173d" containerName="glance-db-sync" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974645 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26d854a-7e7b-4a84-a611-d0672cea173d" containerName="glance-db-sync" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974710 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" containerName="dnsmasq-dns" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.974737 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b444f257-f61f-475d-8e7b-7bca049b5244" containerName="ovn-config" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.975793 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:33 crc kubenswrapper[4892]: I0217 18:02:33.989690 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-b2865"] Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.080771 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-config\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.080959 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " 
pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.081009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-svc\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.081048 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.081186 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.081305 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldz44\" (UniqueName: \"kubernetes.io/projected/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-kube-api-access-ldz44\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.182620 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldz44\" (UniqueName: \"kubernetes.io/projected/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-kube-api-access-ldz44\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: 
\"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.182961 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-config\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.183022 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.183047 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-svc\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.183067 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.183144 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 
18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.183898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-config\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.183987 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.184005 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.184122 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.184529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-svc\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.206705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ldz44\" (UniqueName: \"kubernetes.io/projected/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-kube-api-access-ldz44\") pod \"dnsmasq-dns-895cf5cf-b2865\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.294290 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:34 crc kubenswrapper[4892]: I0217 18:02:34.754781 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-b2865"] Feb 17 18:02:35 crc kubenswrapper[4892]: I0217 18:02:35.370967 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a40576fc-fbd3-45f5-afb8-50de90642017" path="/var/lib/kubelet/pods/a40576fc-fbd3-45f5-afb8-50de90642017/volumes" Feb 17 18:02:35 crc kubenswrapper[4892]: I0217 18:02:35.577220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:02:35 crc kubenswrapper[4892]: I0217 18:02:35.582628 4892 generic.go:334] "Generic (PLEG): container finished" podID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerID="0fe2b4ec3dd7289b03b187a5f65791a6d000005cecae7271c83dbd7b1687ece0" exitCode=0 Feb 17 18:02:35 crc kubenswrapper[4892]: I0217 18:02:35.582678 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-b2865" event={"ID":"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7","Type":"ContainerDied","Data":"0fe2b4ec3dd7289b03b187a5f65791a6d000005cecae7271c83dbd7b1687ece0"} Feb 17 18:02:35 crc kubenswrapper[4892]: I0217 18:02:35.582711 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-b2865" event={"ID":"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7","Type":"ContainerStarted","Data":"d75e0ccc46e0ea3bf8a05a62b3a9be810cf3bfcb87f9d8570b440d8ffd8dfa13"} Feb 17 18:02:36 crc kubenswrapper[4892]: I0217 18:02:36.064542 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 18:02:36 crc kubenswrapper[4892]: I0217 18:02:36.593906 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-b2865" event={"ID":"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7","Type":"ContainerStarted","Data":"bccde332d60d6b0e45f0bf05b2c45bb513cdbbe8b8c25952d2bde8ff07e3a9e4"} Feb 17 18:02:36 crc kubenswrapper[4892]: I0217 18:02:36.595138 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:37 crc kubenswrapper[4892]: I0217 18:02:37.425024 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:02:37 crc kubenswrapper[4892]: I0217 18:02:37.425079 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.524564 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-b2865" podStartSLOduration=5.524542451 podStartE2EDuration="5.524542451s" podCreationTimestamp="2026-02-17 18:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:02:36.624004194 +0000 UTC m=+1127.999407459" watchObservedRunningTime="2026-02-17 18:02:38.524542451 +0000 UTC m=+1129.899945726" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.531906 4892 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-create-65gsr"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.533445 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.540763 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-65gsr"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.650490 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3909-account-create-update-zjjz8"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.665962 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e03215-2841-458f-86b7-9bd2882f07a8-operator-scripts\") pod \"cinder-db-create-65gsr\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.666018 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbspg\" (UniqueName: \"kubernetes.io/projected/d3e03215-2841-458f-86b7-9bd2882f07a8-kube-api-access-qbspg\") pod \"cinder-db-create-65gsr\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.677424 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.679943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3909-account-create-update-zjjz8"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.680738 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.731291 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vvgtf"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.732375 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.743422 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vvgtf"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.767367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mbms\" (UniqueName: \"kubernetes.io/projected/48b3dff9-6db4-4820-9920-9e5a24401e98-kube-api-access-6mbms\") pod \"cinder-3909-account-create-update-zjjz8\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.767551 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e03215-2841-458f-86b7-9bd2882f07a8-operator-scripts\") pod \"cinder-db-create-65gsr\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.767573 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/48b3dff9-6db4-4820-9920-9e5a24401e98-operator-scripts\") pod \"cinder-3909-account-create-update-zjjz8\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.767604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbspg\" (UniqueName: \"kubernetes.io/projected/d3e03215-2841-458f-86b7-9bd2882f07a8-kube-api-access-qbspg\") pod \"cinder-db-create-65gsr\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.769010 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e03215-2841-458f-86b7-9bd2882f07a8-operator-scripts\") pod \"cinder-db-create-65gsr\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.807631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbspg\" (UniqueName: \"kubernetes.io/projected/d3e03215-2841-458f-86b7-9bd2882f07a8-kube-api-access-qbspg\") pod \"cinder-db-create-65gsr\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.845848 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-fa7c-account-create-update-zqj46"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.847320 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.850518 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.853771 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.856706 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fa7c-account-create-update-zqj46"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.869292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b3dff9-6db4-4820-9920-9e5a24401e98-operator-scripts\") pod \"cinder-3909-account-create-update-zjjz8\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.869348 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4c45\" (UniqueName: \"kubernetes.io/projected/b6ef4386-19dd-4bd6-bf1c-53598735a302-kube-api-access-b4c45\") pod \"barbican-db-create-vvgtf\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.869401 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mbms\" (UniqueName: \"kubernetes.io/projected/48b3dff9-6db4-4820-9920-9e5a24401e98-kube-api-access-6mbms\") pod \"cinder-3909-account-create-update-zjjz8\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.869436 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6ef4386-19dd-4bd6-bf1c-53598735a302-operator-scripts\") pod \"barbican-db-create-vvgtf\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.870167 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b3dff9-6db4-4820-9920-9e5a24401e98-operator-scripts\") pod \"cinder-3909-account-create-update-zjjz8\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.893645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mbms\" (UniqueName: \"kubernetes.io/projected/48b3dff9-6db4-4820-9920-9e5a24401e98-kube-api-access-6mbms\") pod \"cinder-3909-account-create-update-zjjz8\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.930503 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tj55v"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.936891 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.944027 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tj55v"] Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.970708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4c45\" (UniqueName: \"kubernetes.io/projected/b6ef4386-19dd-4bd6-bf1c-53598735a302-kube-api-access-b4c45\") pod \"barbican-db-create-vvgtf\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.970777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ngfx\" (UniqueName: \"kubernetes.io/projected/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-kube-api-access-5ngfx\") pod \"barbican-fa7c-account-create-update-zqj46\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.970924 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6ef4386-19dd-4bd6-bf1c-53598735a302-operator-scripts\") pod \"barbican-db-create-vvgtf\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.971010 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-operator-scripts\") pod \"barbican-fa7c-account-create-update-zqj46\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:38 crc kubenswrapper[4892]: I0217 18:02:38.971597 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6ef4386-19dd-4bd6-bf1c-53598735a302-operator-scripts\") pod \"barbican-db-create-vvgtf\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:38.995443 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:38.999494 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4c45\" (UniqueName: \"kubernetes.io/projected/b6ef4386-19dd-4bd6-bf1c-53598735a302-kube-api-access-b4c45\") pod \"barbican-db-create-vvgtf\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.019581 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lvsr8"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.025740 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.031189 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.031368 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.039647 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jpwh4" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.040368 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.049249 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.062916 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lvsr8"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.072941 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9plv5\" (UniqueName: \"kubernetes.io/projected/daec0414-a6e3-4f1d-bc43-cccfa0444894-kube-api-access-9plv5\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.073010 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-operator-scripts\") pod \"barbican-fa7c-account-create-update-zqj46\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.073035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-config-data\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.073065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-combined-ca-bundle\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.073090 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-v7g2l\" (UniqueName: \"kubernetes.io/projected/64d8e63f-e602-4e70-ac10-58f577858490-kube-api-access-v7g2l\") pod \"neutron-db-create-tj55v\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.073138 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ngfx\" (UniqueName: \"kubernetes.io/projected/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-kube-api-access-5ngfx\") pod \"barbican-fa7c-account-create-update-zqj46\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.073177 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64d8e63f-e602-4e70-ac10-58f577858490-operator-scripts\") pod \"neutron-db-create-tj55v\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.074478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-operator-scripts\") pod \"barbican-fa7c-account-create-update-zqj46\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.111544 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ngfx\" (UniqueName: \"kubernetes.io/projected/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-kube-api-access-5ngfx\") pod \"barbican-fa7c-account-create-update-zqj46\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.115912 4892 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2024-account-create-update-lhrkp"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.120278 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.124431 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.162376 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2024-account-create-update-lhrkp"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.171283 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.174899 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64d8e63f-e602-4e70-ac10-58f577858490-operator-scripts\") pod \"neutron-db-create-tj55v\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.174936 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484d3758-2960-4e5d-89a8-d36a6cefd791-operator-scripts\") pod \"neutron-2024-account-create-update-lhrkp\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.174985 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9plv5\" (UniqueName: \"kubernetes.io/projected/daec0414-a6e3-4f1d-bc43-cccfa0444894-kube-api-access-9plv5\") pod \"keystone-db-sync-lvsr8\" (UID: 
\"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.175022 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx7t4\" (UniqueName: \"kubernetes.io/projected/484d3758-2960-4e5d-89a8-d36a6cefd791-kube-api-access-zx7t4\") pod \"neutron-2024-account-create-update-lhrkp\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.175055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-config-data\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.175085 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-combined-ca-bundle\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.175109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7g2l\" (UniqueName: \"kubernetes.io/projected/64d8e63f-e602-4e70-ac10-58f577858490-kube-api-access-v7g2l\") pod \"neutron-db-create-tj55v\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.175552 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64d8e63f-e602-4e70-ac10-58f577858490-operator-scripts\") pod \"neutron-db-create-tj55v\" (UID: 
\"64d8e63f-e602-4e70-ac10-58f577858490\") " pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.179358 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-combined-ca-bundle\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.181450 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-config-data\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.201130 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7g2l\" (UniqueName: \"kubernetes.io/projected/64d8e63f-e602-4e70-ac10-58f577858490-kube-api-access-v7g2l\") pod \"neutron-db-create-tj55v\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.211606 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9plv5\" (UniqueName: \"kubernetes.io/projected/daec0414-a6e3-4f1d-bc43-cccfa0444894-kube-api-access-9plv5\") pod \"keystone-db-sync-lvsr8\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.276980 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484d3758-2960-4e5d-89a8-d36a6cefd791-operator-scripts\") pod \"neutron-2024-account-create-update-lhrkp\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " 
pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.277095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx7t4\" (UniqueName: \"kubernetes.io/projected/484d3758-2960-4e5d-89a8-d36a6cefd791-kube-api-access-zx7t4\") pod \"neutron-2024-account-create-update-lhrkp\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.277934 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484d3758-2960-4e5d-89a8-d36a6cefd791-operator-scripts\") pod \"neutron-2024-account-create-update-lhrkp\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.279560 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.301716 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx7t4\" (UniqueName: \"kubernetes.io/projected/484d3758-2960-4e5d-89a8-d36a6cefd791-kube-api-access-zx7t4\") pod \"neutron-2024-account-create-update-lhrkp\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.465335 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-65gsr"] Feb 17 18:02:39 crc kubenswrapper[4892]: W0217 18:02:39.470037 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e03215_2841_458f_86b7_9bd2882f07a8.slice/crio-26e41cd57a901d01f379b609ee744ecd0bdcb7594d20e5331d7e70c1fd6782d9 WatchSource:0}: Error finding container 26e41cd57a901d01f379b609ee744ecd0bdcb7594d20e5331d7e70c1fd6782d9: Status 404 returned error can't find the container with id 26e41cd57a901d01f379b609ee744ecd0bdcb7594d20e5331d7e70c1fd6782d9 Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.472354 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.498389 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.626319 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3909-account-create-update-zjjz8"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.657698 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-65gsr" event={"ID":"d3e03215-2841-458f-86b7-9bd2882f07a8","Type":"ContainerStarted","Data":"26e41cd57a901d01f379b609ee744ecd0bdcb7594d20e5331d7e70c1fd6782d9"} Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.709366 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vvgtf"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.782888 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tj55v"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.826715 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fa7c-account-create-update-zqj46"] Feb 17 18:02:39 crc kubenswrapper[4892]: I0217 18:02:39.989255 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lvsr8"] Feb 17 18:02:39 crc kubenswrapper[4892]: W0217 18:02:39.997108 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaec0414_a6e3_4f1d_bc43_cccfa0444894.slice/crio-c9fa314de0216a5ae662110e509ae23c942bc0e067e91cde3885229df59919a6 WatchSource:0}: Error finding container c9fa314de0216a5ae662110e509ae23c942bc0e067e91cde3885229df59919a6: Status 404 returned error can't find the container with id c9fa314de0216a5ae662110e509ae23c942bc0e067e91cde3885229df59919a6 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.144391 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2024-account-create-update-lhrkp"] Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.668416 
4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvsr8" event={"ID":"daec0414-a6e3-4f1d-bc43-cccfa0444894","Type":"ContainerStarted","Data":"c9fa314de0216a5ae662110e509ae23c942bc0e067e91cde3885229df59919a6"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.670323 4892 generic.go:334] "Generic (PLEG): container finished" podID="48b3dff9-6db4-4820-9920-9e5a24401e98" containerID="81fff63afe534220bbefc6836acd04547fb898d8b45cbda7b7e4b0b8ac3851af" exitCode=0 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.670391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3909-account-create-update-zjjz8" event={"ID":"48b3dff9-6db4-4820-9920-9e5a24401e98","Type":"ContainerDied","Data":"81fff63afe534220bbefc6836acd04547fb898d8b45cbda7b7e4b0b8ac3851af"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.670423 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3909-account-create-update-zjjz8" event={"ID":"48b3dff9-6db4-4820-9920-9e5a24401e98","Type":"ContainerStarted","Data":"e2ab4f9d9c9c5b5a7d665f34c2a2cc8882c307a1f67bff7543fbbf32ebbe2ea5"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.671741 4892 generic.go:334] "Generic (PLEG): container finished" podID="484d3758-2960-4e5d-89a8-d36a6cefd791" containerID="7285fea6e2decb3e8ebd69fc9d2b7024af73b52009fa277ef34f56d1a2aa23c3" exitCode=0 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.671781 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2024-account-create-update-lhrkp" event={"ID":"484d3758-2960-4e5d-89a8-d36a6cefd791","Type":"ContainerDied","Data":"7285fea6e2decb3e8ebd69fc9d2b7024af73b52009fa277ef34f56d1a2aa23c3"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.671798 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2024-account-create-update-lhrkp" 
event={"ID":"484d3758-2960-4e5d-89a8-d36a6cefd791","Type":"ContainerStarted","Data":"6f1c6b24255d2797c30fedfdd2abd4db150a255b6cef207d0bba8c2764d254a2"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.675402 4892 generic.go:334] "Generic (PLEG): container finished" podID="64d8e63f-e602-4e70-ac10-58f577858490" containerID="79887136152b98a313b969d8b3e992c5f5e5bd87b99e8bdb68693793eb7ca14a" exitCode=0 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.675477 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tj55v" event={"ID":"64d8e63f-e602-4e70-ac10-58f577858490","Type":"ContainerDied","Data":"79887136152b98a313b969d8b3e992c5f5e5bd87b99e8bdb68693793eb7ca14a"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.675504 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tj55v" event={"ID":"64d8e63f-e602-4e70-ac10-58f577858490","Type":"ContainerStarted","Data":"cf9cbf8c62476624a7642f8cba27f789cafc47ea98a8348706eb04403a8299ff"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.677977 4892 generic.go:334] "Generic (PLEG): container finished" podID="b6ef4386-19dd-4bd6-bf1c-53598735a302" containerID="9977ae08b25b1edefaebcd314933aeb3b589fcc3968062a97ddb2200bd76769c" exitCode=0 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.678049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vvgtf" event={"ID":"b6ef4386-19dd-4bd6-bf1c-53598735a302","Type":"ContainerDied","Data":"9977ae08b25b1edefaebcd314933aeb3b589fcc3968062a97ddb2200bd76769c"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.678071 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vvgtf" event={"ID":"b6ef4386-19dd-4bd6-bf1c-53598735a302","Type":"ContainerStarted","Data":"817044030f42293f2b8c1d64a9c0d39170de958476d98acea4d768dbf926fb21"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.679937 4892 generic.go:334] "Generic 
(PLEG): container finished" podID="bad7ea87-ce2b-4897-96b2-99ff27b92c8d" containerID="7a8b1188339d86be3d5c064511d914198ccd1ff144a0a5223403ce3b3e9cdf6b" exitCode=0 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.679999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fa7c-account-create-update-zqj46" event={"ID":"bad7ea87-ce2b-4897-96b2-99ff27b92c8d","Type":"ContainerDied","Data":"7a8b1188339d86be3d5c064511d914198ccd1ff144a0a5223403ce3b3e9cdf6b"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.680023 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fa7c-account-create-update-zqj46" event={"ID":"bad7ea87-ce2b-4897-96b2-99ff27b92c8d","Type":"ContainerStarted","Data":"531865de3e2135249ba4426c516930a40d80b340743653a60b20f55f8709d674"} Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.681875 4892 generic.go:334] "Generic (PLEG): container finished" podID="d3e03215-2841-458f-86b7-9bd2882f07a8" containerID="8a100a0c19ecedf25cf13fb999386c04f0bdb55ef5cf32b452775a4f80064859" exitCode=0 Feb 17 18:02:40 crc kubenswrapper[4892]: I0217 18:02:40.681901 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-65gsr" event={"ID":"d3e03215-2841-458f-86b7-9bd2882f07a8","Type":"ContainerDied","Data":"8a100a0c19ecedf25cf13fb999386c04f0bdb55ef5cf32b452775a4f80064859"} Feb 17 18:02:44 crc kubenswrapper[4892]: I0217 18:02:44.295990 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:02:44 crc kubenswrapper[4892]: I0217 18:02:44.366336 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cnqps"] Feb 17 18:02:44 crc kubenswrapper[4892]: I0217 18:02:44.366537 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="dnsmasq-dns" 
containerID="cri-o://6b42ecebf51df8c144389699c3acf38d14ba7ddca00774dcc4e5cab68116355d" gracePeriod=10 Feb 17 18:02:46 crc kubenswrapper[4892]: I0217 18:02:46.745763 4892 generic.go:334] "Generic (PLEG): container finished" podID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerID="6b42ecebf51df8c144389699c3acf38d14ba7ddca00774dcc4e5cab68116355d" exitCode=0 Feb 17 18:02:46 crc kubenswrapper[4892]: I0217 18:02:46.745881 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" event={"ID":"08f3009f-758d-4bca-b836-ea3ba6e8d097","Type":"ContainerDied","Data":"6b42ecebf51df8c144389699c3acf38d14ba7ddca00774dcc4e5cab68116355d"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.635391 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.638352 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.644429 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.660730 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.682692 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.707527 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.742089 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b3dff9-6db4-4820-9920-9e5a24401e98-operator-scripts\") pod \"48b3dff9-6db4-4820-9920-9e5a24401e98\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.742543 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e03215-2841-458f-86b7-9bd2882f07a8-operator-scripts\") pod \"d3e03215-2841-458f-86b7-9bd2882f07a8\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.743284 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e03215-2841-458f-86b7-9bd2882f07a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3e03215-2841-458f-86b7-9bd2882f07a8" (UID: "d3e03215-2841-458f-86b7-9bd2882f07a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.743132 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b3dff9-6db4-4820-9920-9e5a24401e98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48b3dff9-6db4-4820-9920-9e5a24401e98" (UID: "48b3dff9-6db4-4820-9920-9e5a24401e98"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.744128 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b3dff9-6db4-4820-9920-9e5a24401e98-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.744153 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e03215-2841-458f-86b7-9bd2882f07a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.773449 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fa7c-account-create-update-zqj46" event={"ID":"bad7ea87-ce2b-4897-96b2-99ff27b92c8d","Type":"ContainerDied","Data":"531865de3e2135249ba4426c516930a40d80b340743653a60b20f55f8709d674"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.773528 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531865de3e2135249ba4426c516930a40d80b340743653a60b20f55f8709d674" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.773478 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-zqj46" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.775374 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-65gsr" event={"ID":"d3e03215-2841-458f-86b7-9bd2882f07a8","Type":"ContainerDied","Data":"26e41cd57a901d01f379b609ee744ecd0bdcb7594d20e5331d7e70c1fd6782d9"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.775501 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e41cd57a901d01f379b609ee744ecd0bdcb7594d20e5331d7e70c1fd6782d9" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.775662 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-65gsr" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.777861 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3909-account-create-update-zjjz8" event={"ID":"48b3dff9-6db4-4820-9920-9e5a24401e98","Type":"ContainerDied","Data":"e2ab4f9d9c9c5b5a7d665f34c2a2cc8882c307a1f67bff7543fbbf32ebbe2ea5"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.777977 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ab4f9d9c9c5b5a7d665f34c2a2cc8882c307a1f67bff7543fbbf32ebbe2ea5" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.777876 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3909-account-create-update-zjjz8" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.779532 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2024-account-create-update-lhrkp" event={"ID":"484d3758-2960-4e5d-89a8-d36a6cefd791","Type":"ContainerDied","Data":"6f1c6b24255d2797c30fedfdd2abd4db150a255b6cef207d0bba8c2764d254a2"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.779575 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2024-account-create-update-lhrkp" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.779578 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f1c6b24255d2797c30fedfdd2abd4db150a255b6cef207d0bba8c2764d254a2" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.781314 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tj55v" event={"ID":"64d8e63f-e602-4e70-ac10-58f577858490","Type":"ContainerDied","Data":"cf9cbf8c62476624a7642f8cba27f789cafc47ea98a8348706eb04403a8299ff"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.781424 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf9cbf8c62476624a7642f8cba27f789cafc47ea98a8348706eb04403a8299ff" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.781323 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tj55v" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.782751 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vvgtf" event={"ID":"b6ef4386-19dd-4bd6-bf1c-53598735a302","Type":"ContainerDied","Data":"817044030f42293f2b8c1d64a9c0d39170de958476d98acea4d768dbf926fb21"} Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.782780 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817044030f42293f2b8c1d64a9c0d39170de958476d98acea4d768dbf926fb21" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.782844 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vvgtf" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845152 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx7t4\" (UniqueName: \"kubernetes.io/projected/484d3758-2960-4e5d-89a8-d36a6cefd791-kube-api-access-zx7t4\") pod \"484d3758-2960-4e5d-89a8-d36a6cefd791\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845203 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mbms\" (UniqueName: \"kubernetes.io/projected/48b3dff9-6db4-4820-9920-9e5a24401e98-kube-api-access-6mbms\") pod \"48b3dff9-6db4-4820-9920-9e5a24401e98\" (UID: \"48b3dff9-6db4-4820-9920-9e5a24401e98\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845232 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbspg\" (UniqueName: \"kubernetes.io/projected/d3e03215-2841-458f-86b7-9bd2882f07a8-kube-api-access-qbspg\") pod \"d3e03215-2841-458f-86b7-9bd2882f07a8\" (UID: \"d3e03215-2841-458f-86b7-9bd2882f07a8\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845254 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64d8e63f-e602-4e70-ac10-58f577858490-operator-scripts\") pod \"64d8e63f-e602-4e70-ac10-58f577858490\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845279 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4c45\" (UniqueName: \"kubernetes.io/projected/b6ef4386-19dd-4bd6-bf1c-53598735a302-kube-api-access-b4c45\") pod \"b6ef4386-19dd-4bd6-bf1c-53598735a302\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845320 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5ngfx\" (UniqueName: \"kubernetes.io/projected/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-kube-api-access-5ngfx\") pod \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845350 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-operator-scripts\") pod \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\" (UID: \"bad7ea87-ce2b-4897-96b2-99ff27b92c8d\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845387 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484d3758-2960-4e5d-89a8-d36a6cefd791-operator-scripts\") pod \"484d3758-2960-4e5d-89a8-d36a6cefd791\" (UID: \"484d3758-2960-4e5d-89a8-d36a6cefd791\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845408 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7g2l\" (UniqueName: \"kubernetes.io/projected/64d8e63f-e602-4e70-ac10-58f577858490-kube-api-access-v7g2l\") pod \"64d8e63f-e602-4e70-ac10-58f577858490\" (UID: \"64d8e63f-e602-4e70-ac10-58f577858490\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.845436 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6ef4386-19dd-4bd6-bf1c-53598735a302-operator-scripts\") pod \"b6ef4386-19dd-4bd6-bf1c-53598735a302\" (UID: \"b6ef4386-19dd-4bd6-bf1c-53598735a302\") " Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.846359 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d8e63f-e602-4e70-ac10-58f577858490-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"64d8e63f-e602-4e70-ac10-58f577858490" (UID: "64d8e63f-e602-4e70-ac10-58f577858490"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.846470 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ef4386-19dd-4bd6-bf1c-53598735a302-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6ef4386-19dd-4bd6-bf1c-53598735a302" (UID: "b6ef4386-19dd-4bd6-bf1c-53598735a302"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.846477 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484d3758-2960-4e5d-89a8-d36a6cefd791-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "484d3758-2960-4e5d-89a8-d36a6cefd791" (UID: "484d3758-2960-4e5d-89a8-d36a6cefd791"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.846875 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bad7ea87-ce2b-4897-96b2-99ff27b92c8d" (UID: "bad7ea87-ce2b-4897-96b2-99ff27b92c8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.850738 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-kube-api-access-5ngfx" (OuterVolumeSpecName: "kube-api-access-5ngfx") pod "bad7ea87-ce2b-4897-96b2-99ff27b92c8d" (UID: "bad7ea87-ce2b-4897-96b2-99ff27b92c8d"). InnerVolumeSpecName "kube-api-access-5ngfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.853418 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ef4386-19dd-4bd6-bf1c-53598735a302-kube-api-access-b4c45" (OuterVolumeSpecName: "kube-api-access-b4c45") pod "b6ef4386-19dd-4bd6-bf1c-53598735a302" (UID: "b6ef4386-19dd-4bd6-bf1c-53598735a302"). InnerVolumeSpecName "kube-api-access-b4c45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.853651 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e03215-2841-458f-86b7-9bd2882f07a8-kube-api-access-qbspg" (OuterVolumeSpecName: "kube-api-access-qbspg") pod "d3e03215-2841-458f-86b7-9bd2882f07a8" (UID: "d3e03215-2841-458f-86b7-9bd2882f07a8"). InnerVolumeSpecName "kube-api-access-qbspg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.854151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b3dff9-6db4-4820-9920-9e5a24401e98-kube-api-access-6mbms" (OuterVolumeSpecName: "kube-api-access-6mbms") pod "48b3dff9-6db4-4820-9920-9e5a24401e98" (UID: "48b3dff9-6db4-4820-9920-9e5a24401e98"). InnerVolumeSpecName "kube-api-access-6mbms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.854249 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484d3758-2960-4e5d-89a8-d36a6cefd791-kube-api-access-zx7t4" (OuterVolumeSpecName: "kube-api-access-zx7t4") pod "484d3758-2960-4e5d-89a8-d36a6cefd791" (UID: "484d3758-2960-4e5d-89a8-d36a6cefd791"). InnerVolumeSpecName "kube-api-access-zx7t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.860005 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d8e63f-e602-4e70-ac10-58f577858490-kube-api-access-v7g2l" (OuterVolumeSpecName: "kube-api-access-v7g2l") pod "64d8e63f-e602-4e70-ac10-58f577858490" (UID: "64d8e63f-e602-4e70-ac10-58f577858490"). InnerVolumeSpecName "kube-api-access-v7g2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946204 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx7t4\" (UniqueName: \"kubernetes.io/projected/484d3758-2960-4e5d-89a8-d36a6cefd791-kube-api-access-zx7t4\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946237 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mbms\" (UniqueName: \"kubernetes.io/projected/48b3dff9-6db4-4820-9920-9e5a24401e98-kube-api-access-6mbms\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946246 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbspg\" (UniqueName: \"kubernetes.io/projected/d3e03215-2841-458f-86b7-9bd2882f07a8-kube-api-access-qbspg\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946256 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64d8e63f-e602-4e70-ac10-58f577858490-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946266 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4c45\" (UniqueName: \"kubernetes.io/projected/b6ef4386-19dd-4bd6-bf1c-53598735a302-kube-api-access-b4c45\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946275 4892 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ngfx\" (UniqueName: \"kubernetes.io/projected/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-kube-api-access-5ngfx\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946283 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad7ea87-ce2b-4897-96b2-99ff27b92c8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946291 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484d3758-2960-4e5d-89a8-d36a6cefd791-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946300 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7g2l\" (UniqueName: \"kubernetes.io/projected/64d8e63f-e602-4e70-ac10-58f577858490-kube-api-access-v7g2l\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:47 crc kubenswrapper[4892]: I0217 18:02:47.946309 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6ef4386-19dd-4bd6-bf1c-53598735a302-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.063641 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.250642 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-swift-storage-0\") pod \"08f3009f-758d-4bca-b836-ea3ba6e8d097\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.250734 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hdfd\" (UniqueName: \"kubernetes.io/projected/08f3009f-758d-4bca-b836-ea3ba6e8d097-kube-api-access-9hdfd\") pod \"08f3009f-758d-4bca-b836-ea3ba6e8d097\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.250767 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-svc\") pod \"08f3009f-758d-4bca-b836-ea3ba6e8d097\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.250837 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-config\") pod \"08f3009f-758d-4bca-b836-ea3ba6e8d097\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.250865 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-sb\") pod \"08f3009f-758d-4bca-b836-ea3ba6e8d097\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.250913 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-nb\") pod \"08f3009f-758d-4bca-b836-ea3ba6e8d097\" (UID: \"08f3009f-758d-4bca-b836-ea3ba6e8d097\") " Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.256614 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f3009f-758d-4bca-b836-ea3ba6e8d097-kube-api-access-9hdfd" (OuterVolumeSpecName: "kube-api-access-9hdfd") pod "08f3009f-758d-4bca-b836-ea3ba6e8d097" (UID: "08f3009f-758d-4bca-b836-ea3ba6e8d097"). InnerVolumeSpecName "kube-api-access-9hdfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.306645 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-config" (OuterVolumeSpecName: "config") pod "08f3009f-758d-4bca-b836-ea3ba6e8d097" (UID: "08f3009f-758d-4bca-b836-ea3ba6e8d097"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.307239 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08f3009f-758d-4bca-b836-ea3ba6e8d097" (UID: "08f3009f-758d-4bca-b836-ea3ba6e8d097"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.315112 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08f3009f-758d-4bca-b836-ea3ba6e8d097" (UID: "08f3009f-758d-4bca-b836-ea3ba6e8d097"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.326381 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08f3009f-758d-4bca-b836-ea3ba6e8d097" (UID: "08f3009f-758d-4bca-b836-ea3ba6e8d097"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.331095 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08f3009f-758d-4bca-b836-ea3ba6e8d097" (UID: "08f3009f-758d-4bca-b836-ea3ba6e8d097"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.352984 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.353016 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.353026 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.353035 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hdfd\" (UniqueName: 
\"kubernetes.io/projected/08f3009f-758d-4bca-b836-ea3ba6e8d097-kube-api-access-9hdfd\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.353046 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.353058 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f3009f-758d-4bca-b836-ea3ba6e8d097-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.816108 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvsr8" event={"ID":"daec0414-a6e3-4f1d-bc43-cccfa0444894","Type":"ContainerStarted","Data":"c3def8ed7cbb31f1e7a1e19acdef66f4145697e848ca6bd61724ce8cd5374b4b"} Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.820346 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" event={"ID":"08f3009f-758d-4bca-b836-ea3ba6e8d097","Type":"ContainerDied","Data":"758d5deb7c3c33b37f9970c52818e6b886ae96e8c4252624a7afc935347cbb5c"} Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.820390 4892 scope.go:117] "RemoveContainer" containerID="6b42ecebf51df8c144389699c3acf38d14ba7ddca00774dcc4e5cab68116355d" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.820397 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.847247 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lvsr8" podStartSLOduration=2.694352065 podStartE2EDuration="10.847223203s" podCreationTimestamp="2026-02-17 18:02:38 +0000 UTC" firstStartedPulling="2026-02-17 18:02:39.999317593 +0000 UTC m=+1131.374720858" lastFinishedPulling="2026-02-17 18:02:48.152188731 +0000 UTC m=+1139.527591996" observedRunningTime="2026-02-17 18:02:48.836236869 +0000 UTC m=+1140.211640134" watchObservedRunningTime="2026-02-17 18:02:48.847223203 +0000 UTC m=+1140.222626468" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.861901 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cnqps"] Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.868135 4892 scope.go:117] "RemoveContainer" containerID="b3045e29696030e8c5b4ad9566b0e640b3288767304a646d75ee3d334a28d30a" Feb 17 18:02:48 crc kubenswrapper[4892]: I0217 18:02:48.869438 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cnqps"] Feb 17 18:02:49 crc kubenswrapper[4892]: I0217 18:02:49.371626 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" path="/var/lib/kubelet/pods/08f3009f-758d-4bca-b836-ea3ba6e8d097/volumes" Feb 17 18:02:53 crc kubenswrapper[4892]: I0217 18:02:53.014638 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-cnqps" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: i/o timeout" Feb 17 18:03:05 crc kubenswrapper[4892]: I0217 18:03:05.998526 4892 generic.go:334] "Generic (PLEG): container finished" podID="daec0414-a6e3-4f1d-bc43-cccfa0444894" 
containerID="c3def8ed7cbb31f1e7a1e19acdef66f4145697e848ca6bd61724ce8cd5374b4b" exitCode=0 Feb 17 18:03:05 crc kubenswrapper[4892]: I0217 18:03:05.998633 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvsr8" event={"ID":"daec0414-a6e3-4f1d-bc43-cccfa0444894","Type":"ContainerDied","Data":"c3def8ed7cbb31f1e7a1e19acdef66f4145697e848ca6bd61724ce8cd5374b4b"} Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.307114 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.407638 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-config-data\") pod \"daec0414-a6e3-4f1d-bc43-cccfa0444894\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.407740 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-combined-ca-bundle\") pod \"daec0414-a6e3-4f1d-bc43-cccfa0444894\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.407793 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9plv5\" (UniqueName: \"kubernetes.io/projected/daec0414-a6e3-4f1d-bc43-cccfa0444894-kube-api-access-9plv5\") pod \"daec0414-a6e3-4f1d-bc43-cccfa0444894\" (UID: \"daec0414-a6e3-4f1d-bc43-cccfa0444894\") " Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.414326 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daec0414-a6e3-4f1d-bc43-cccfa0444894-kube-api-access-9plv5" (OuterVolumeSpecName: "kube-api-access-9plv5") pod "daec0414-a6e3-4f1d-bc43-cccfa0444894" (UID: 
"daec0414-a6e3-4f1d-bc43-cccfa0444894"). InnerVolumeSpecName "kube-api-access-9plv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.424430 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.424506 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.445466 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daec0414-a6e3-4f1d-bc43-cccfa0444894" (UID: "daec0414-a6e3-4f1d-bc43-cccfa0444894"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.459874 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-config-data" (OuterVolumeSpecName: "config-data") pod "daec0414-a6e3-4f1d-bc43-cccfa0444894" (UID: "daec0414-a6e3-4f1d-bc43-cccfa0444894"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.510400 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.510431 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daec0414-a6e3-4f1d-bc43-cccfa0444894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:07 crc kubenswrapper[4892]: I0217 18:03:07.510441 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9plv5\" (UniqueName: \"kubernetes.io/projected/daec0414-a6e3-4f1d-bc43-cccfa0444894-kube-api-access-9plv5\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.018673 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvsr8" event={"ID":"daec0414-a6e3-4f1d-bc43-cccfa0444894","Type":"ContainerDied","Data":"c9fa314de0216a5ae662110e509ae23c942bc0e067e91cde3885229df59919a6"} Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.018708 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9fa314de0216a5ae662110e509ae23c942bc0e067e91cde3885229df59919a6" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.018758 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lvsr8" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.193907 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nxqvf"] Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194268 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ef4386-19dd-4bd6-bf1c-53598735a302" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194284 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ef4386-19dd-4bd6-bf1c-53598735a302" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194300 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484d3758-2960-4e5d-89a8-d36a6cefd791" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194307 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="484d3758-2960-4e5d-89a8-d36a6cefd791" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194326 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b3dff9-6db4-4820-9920-9e5a24401e98" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194332 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b3dff9-6db4-4820-9920-9e5a24401e98" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194356 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d8e63f-e602-4e70-ac10-58f577858490" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194361 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d8e63f-e602-4e70-ac10-58f577858490" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194372 4892 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="dnsmasq-dns" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194377 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="dnsmasq-dns" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194385 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daec0414-a6e3-4f1d-bc43-cccfa0444894" containerName="keystone-db-sync" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194391 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="daec0414-a6e3-4f1d-bc43-cccfa0444894" containerName="keystone-db-sync" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194403 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7ea87-ce2b-4897-96b2-99ff27b92c8d" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194411 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7ea87-ce2b-4897-96b2-99ff27b92c8d" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194421 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="init" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194427 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="init" Feb 17 18:03:08 crc kubenswrapper[4892]: E0217 18:03:08.194438 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e03215-2841-458f-86b7-9bd2882f07a8" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194444 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e03215-2841-458f-86b7-9bd2882f07a8" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194627 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="484d3758-2960-4e5d-89a8-d36a6cefd791" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194639 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e03215-2841-458f-86b7-9bd2882f07a8" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194650 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b3dff9-6db4-4820-9920-9e5a24401e98" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194659 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d8e63f-e602-4e70-ac10-58f577858490" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194668 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f3009f-758d-4bca-b836-ea3ba6e8d097" containerName="dnsmasq-dns" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194683 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ef4386-19dd-4bd6-bf1c-53598735a302" containerName="mariadb-database-create" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194689 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad7ea87-ce2b-4897-96b2-99ff27b92c8d" containerName="mariadb-account-create-update" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.194699 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="daec0414-a6e3-4f1d-bc43-cccfa0444894" containerName="keystone-db-sync" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.195556 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.224740 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x4dsw"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.225965 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.229185 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.229443 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jpwh4" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.229613 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.229745 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.229870 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.236182 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nxqvf"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.249658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x4dsw"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-config-data\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324519 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-fernet-keys\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324551 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324571 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324650 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldhv\" (UniqueName: \"kubernetes.io/projected/aa7f944f-cd9d-418a-a282-ed67ebf5961d-kube-api-access-5ldhv\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324746 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-scripts\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: 
I0217 18:03:08.324789 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-credential-keys\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-combined-ca-bundle\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324890 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324915 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.324937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7v8x\" (UniqueName: \"kubernetes.io/projected/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-kube-api-access-d7v8x\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc 
kubenswrapper[4892]: I0217 18:03:08.325001 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-config\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426487 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldhv\" (UniqueName: \"kubernetes.io/projected/aa7f944f-cd9d-418a-a282-ed67ebf5961d-kube-api-access-5ldhv\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426553 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-scripts\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426594 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-credential-keys\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426622 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-combined-ca-bundle\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426655 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426699 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7v8x\" (UniqueName: \"kubernetes.io/projected/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-kube-api-access-d7v8x\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426750 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-config\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426801 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-config-data\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426884 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-fernet-keys\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426905 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.426926 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.427994 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.428660 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.429164 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-config\") pod 
\"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.430735 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.431731 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.452085 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-credential-keys\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.452090 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-scripts\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.452673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-combined-ca-bundle\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 
crc kubenswrapper[4892]: I0217 18:03:08.461382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-fernet-keys\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.463076 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-config-data\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.478870 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldhv\" (UniqueName: \"kubernetes.io/projected/aa7f944f-cd9d-418a-a282-ed67ebf5961d-kube-api-access-5ldhv\") pod \"dnsmasq-dns-6c9c9f998c-nxqvf\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") " pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.484732 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7v8x\" (UniqueName: \"kubernetes.io/projected/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-kube-api-access-d7v8x\") pod \"keystone-bootstrap-x4dsw\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.484798 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.487125 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.492114 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.503082 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.516265 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.516386 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.536409 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7rrhb"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.537590 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.541603 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.541805 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6qpqs" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.541858 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.545664 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.588065 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7rrhb"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632790 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjgkj\" (UniqueName: \"kubernetes.io/projected/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-kube-api-access-gjgkj\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632851 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-config-data\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632873 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-db-sync-config-data\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632913 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-log-httpd\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632961 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-run-httpd\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.632982 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-combined-ca-bundle\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.633004 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.633020 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-etc-machine-id\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.633035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-scripts\") pod \"cinder-db-sync-7rrhb\" 
(UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.633053 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5776\" (UniqueName: \"kubernetes.io/projected/14ec5fef-7255-4519-b447-474ccccaebdb-kube-api-access-f5776\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.633076 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-config-data\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.633119 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-scripts\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.640310 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nxqvf"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.663637 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-dw87w"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.665241 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.704137 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-dw87w"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.731641 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-q48cp"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.733623 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736358 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-config\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736388 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-log-httpd\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736424 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-run-httpd\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " 
pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736476 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-combined-ca-bundle\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736517 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736552 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-etc-machine-id\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736578 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-scripts\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736621 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5776\" (UniqueName: \"kubernetes.io/projected/14ec5fef-7255-4519-b447-474ccccaebdb-kube-api-access-f5776\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736682 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-config-data\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736721 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-scripts\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736909 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736928 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.736997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sf56q\" (UniqueName: \"kubernetes.io/projected/04ff949e-3323-4573-ac6a-4b1714b1976c-kube-api-access-sf56q\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.737030 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjgkj\" (UniqueName: \"kubernetes.io/projected/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-kube-api-access-gjgkj\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.737058 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-config-data\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.737084 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.737111 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-db-sync-config-data\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.737145 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-run-httpd\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.737503 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-log-httpd\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.740034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-etc-machine-id\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.755740 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.756205 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.756378 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.756514 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8gj4q" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.758160 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-combined-ca-bundle\") pod 
\"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.759048 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-scripts\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.763007 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.765227 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-scripts\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.768476 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-config-data\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.768526 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-db-sync-config-data\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.784004 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-config-data\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.792146 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5776\" (UniqueName: \"kubernetes.io/projected/14ec5fef-7255-4519-b447-474ccccaebdb-kube-api-access-f5776\") pod \"ceilometer-0\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.793082 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjgkj\" (UniqueName: \"kubernetes.io/projected/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-kube-api-access-gjgkj\") pod \"cinder-db-sync-7rrhb\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.803752 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-79pp7"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.806891 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.819808 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.820510 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-856hl" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.820664 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.837226 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q48cp"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.841935 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.841980 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842014 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf56q\" (UniqueName: \"kubernetes.io/projected/04ff949e-3323-4573-ac6a-4b1714b1976c-kube-api-access-sf56q\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842045 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-config\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842133 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78ddx\" (UniqueName: \"kubernetes.io/projected/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-kube-api-access-78ddx\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842261 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-config\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842310 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.842346 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-combined-ca-bundle\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.845692 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.846626 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.851035 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-79pp7"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.853336 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-config\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.856588 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc 
kubenswrapper[4892]: I0217 18:03:08.857471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.871068 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf56q\" (UniqueName: \"kubernetes.io/projected/04ff949e-3323-4573-ac6a-4b1714b1976c-kube-api-access-sf56q\") pod \"dnsmasq-dns-57c957c4ff-dw87w\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.872940 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nc5l8"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.874308 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.880977 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.881306 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vgzf6" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.895491 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nc5l8"] Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949107 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-db-sync-config-data\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949192 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-scripts\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949241 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gm2\" (UniqueName: \"kubernetes.io/projected/545242bc-24e5-4521-85d8-7aff5cbd4916-kube-api-access-v2gm2\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949323 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-config-data\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949356 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-combined-ca-bundle\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949412 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-combined-ca-bundle\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949445 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78ddx\" (UniqueName: \"kubernetes.io/projected/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-kube-api-access-78ddx\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949533 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be41152-a14a-43b1-b24e-221e614556df-logs\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949588 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-config\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949661 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-combined-ca-bundle\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.949751 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7lrm\" (UniqueName: \"kubernetes.io/projected/3be41152-a14a-43b1-b24e-221e614556df-kube-api-access-t7lrm\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.953396 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.961181 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-config\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.965520 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-combined-ca-bundle\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.974023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78ddx\" (UniqueName: \"kubernetes.io/projected/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-kube-api-access-78ddx\") pod \"neutron-db-sync-q48cp\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:08 crc kubenswrapper[4892]: I0217 18:03:08.990392 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053021 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-combined-ca-bundle\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053107 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be41152-a14a-43b1-b24e-221e614556df-logs\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7lrm\" (UniqueName: \"kubernetes.io/projected/3be41152-a14a-43b1-b24e-221e614556df-kube-api-access-t7lrm\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053244 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-db-sync-config-data\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053294 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-scripts\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 
18:03:09.053321 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gm2\" (UniqueName: \"kubernetes.io/projected/545242bc-24e5-4521-85d8-7aff5cbd4916-kube-api-access-v2gm2\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053375 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-config-data\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.053405 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-combined-ca-bundle\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.057537 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-db-sync-config-data\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.060588 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be41152-a14a-43b1-b24e-221e614556df-logs\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.061959 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-combined-ca-bundle\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.063215 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-scripts\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.073458 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-combined-ca-bundle\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.075793 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-config-data\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.079242 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7lrm\" (UniqueName: \"kubernetes.io/projected/3be41152-a14a-43b1-b24e-221e614556df-kube-api-access-t7lrm\") pod \"placement-db-sync-79pp7\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.084546 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.088039 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gm2\" (UniqueName: \"kubernetes.io/projected/545242bc-24e5-4521-85d8-7aff5cbd4916-kube-api-access-v2gm2\") pod \"barbican-db-sync-nc5l8\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.135707 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.145926 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.218347 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.229588 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x4dsw"] Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.355340 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nxqvf"] Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.407522 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.412053 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.412230 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.418525 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.418611 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.418754 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.418908 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ltggx" Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.431882 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.434778 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.446675 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.447076 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.455109 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607312 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607403 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-logs\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607471 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607496 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607536 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607570 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607610 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607648 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-logs\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607688 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607717 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607747 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607778 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607838 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n66p4\" (UniqueName: \"kubernetes.io/projected/16369677-41cd-4100-a9dd-6a015abc114f-kube-api-access-n66p4\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607869 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgqrn\" (UniqueName: \"kubernetes.io/projected/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-kube-api-access-vgqrn\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.607893 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.608027 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.718679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.718938 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.719787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.719923 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720006 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720076 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-logs\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720143 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720169 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720280 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720345 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.720414 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721204 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n66p4\" (UniqueName: \"kubernetes.io/projected/16369677-41cd-4100-a9dd-6a015abc114f-kube-api-access-n66p4\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgqrn\" (UniqueName: \"kubernetes.io/projected/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-kube-api-access-vgqrn\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721381 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721576 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721678 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.722054 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-logs\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.722393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-logs\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.722490 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.721328 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.729266 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-logs\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.731858 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.735450 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.751695 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.755881 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.776903 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgqrn\" (UniqueName: \"kubernetes.io/projected/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-kube-api-access-vgqrn\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.794675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n66p4\" (UniqueName: \"kubernetes.io/projected/16369677-41cd-4100-a9dd-6a015abc114f-kube-api-access-n66p4\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.795118 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.795541 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.798471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.799176 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: W0217 18:03:09.802010 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85111e5_c198_4b36_bdce_5eb7a8ff75ea.slice/crio-8f1ea9bca6bfc9754522295f832885667c13e97c765003ac1ce99459674f9baa WatchSource:0}: Error finding container 8f1ea9bca6bfc9754522295f832885667c13e97c765003ac1ce99459674f9baa: Status 404 returned error can't find the container with id 8f1ea9bca6bfc9754522295f832885667c13e97c765003ac1ce99459674f9baa
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.802556 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.804031 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7rrhb"]
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.821145 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " pod="openstack/glance-default-external-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.835886 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:09 crc kubenswrapper[4892]: I0217 18:03:09.882747 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:09.999988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.006000 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-79pp7"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.014783 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nc5l8"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.048751 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-dw87w"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.070717 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x4dsw" event={"ID":"2c48fb34-4ec6-4cf7-8988-76e9d9549e23","Type":"ContainerStarted","Data":"3a5a394db3114515f27acb7b3724e0a08543612a8c1883d88a2dbc0f06751727"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.071008 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x4dsw" event={"ID":"2c48fb34-4ec6-4cf7-8988-76e9d9549e23","Type":"ContainerStarted","Data":"52990d10d54c8cab3d481077a05e7c5e592102f95c7fbf490d20e3745b94abf2"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.093559 4892 generic.go:334] "Generic (PLEG): container finished" podID="aa7f944f-cd9d-418a-a282-ed67ebf5961d" containerID="4a3685cf47e4ca1d92082ffdb52ba15e98a7a0b501edf74c6ce91738d76cb316" exitCode=0
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.093626 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" event={"ID":"aa7f944f-cd9d-418a-a282-ed67ebf5961d","Type":"ContainerDied","Data":"4a3685cf47e4ca1d92082ffdb52ba15e98a7a0b501edf74c6ce91738d76cb316"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.093652 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" event={"ID":"aa7f944f-cd9d-418a-a282-ed67ebf5961d","Type":"ContainerStarted","Data":"4ccd25457ff898382e16ef574b43d189c55a658bb8a290ca348dae6d6a245033"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.101579 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-79pp7" event={"ID":"3be41152-a14a-43b1-b24e-221e614556df","Type":"ContainerStarted","Data":"fb032a59436448f4fc282fd5b98b03cd5baf1f63b043144e3afd96aa406d47b7"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.125420 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7rrhb" event={"ID":"e85111e5-c198-4b36-bdce-5eb7a8ff75ea","Type":"ContainerStarted","Data":"8f1ea9bca6bfc9754522295f832885667c13e97c765003ac1ce99459674f9baa"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.126352 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x4dsw" podStartSLOduration=2.126339889 podStartE2EDuration="2.126339889s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:10.098144824 +0000 UTC m=+1161.473548089" watchObservedRunningTime="2026-02-17 18:03:10.126339889 +0000 UTC m=+1161.501743154"
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.149931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerStarted","Data":"65b7540608340a0fd35d712d4378d8841621d0b688dd6c7aa3f7c858d2d0b6e3"}
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.249624 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q48cp"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.561179 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.724222 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.760007 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf"
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.833020 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.866793 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-nb\") pod \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") "
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.870246 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-sb\") pod \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") "
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.870331 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ldhv\" (UniqueName: \"kubernetes.io/projected/aa7f944f-cd9d-418a-a282-ed67ebf5961d-kube-api-access-5ldhv\") pod \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") "
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.870442 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-swift-storage-0\") pod \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") "
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.870479 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-svc\") pod \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") "
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.870608 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-config\") pod \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\" (UID: \"aa7f944f-cd9d-418a-a282-ed67ebf5961d\") "
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.869860 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.892627 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa7f944f-cd9d-418a-a282-ed67ebf5961d-kube-api-access-5ldhv" (OuterVolumeSpecName: "kube-api-access-5ldhv") pod "aa7f944f-cd9d-418a-a282-ed67ebf5961d" (UID: "aa7f944f-cd9d-418a-a282-ed67ebf5961d"). InnerVolumeSpecName "kube-api-access-5ldhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.920236 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-config" (OuterVolumeSpecName: "config") pod "aa7f944f-cd9d-418a-a282-ed67ebf5961d" (UID: "aa7f944f-cd9d-418a-a282-ed67ebf5961d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.924877 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.928688 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa7f944f-cd9d-418a-a282-ed67ebf5961d" (UID: "aa7f944f-cd9d-418a-a282-ed67ebf5961d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.942680 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa7f944f-cd9d-418a-a282-ed67ebf5961d" (UID: "aa7f944f-cd9d-418a-a282-ed67ebf5961d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:10 crc kubenswrapper[4892]: W0217 18:03:10.967208 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8faf0043_ec66_4725_9db0_e0b7a4d3da6d.slice/crio-0e1de3fa4771e1abd1be750d53637068e3f7b547404c610e45f4b5adfe93f430 WatchSource:0}: Error finding container 0e1de3fa4771e1abd1be750d53637068e3f7b547404c610e45f4b5adfe93f430: Status 404 returned error can't find the container with id 0e1de3fa4771e1abd1be750d53637068e3f7b547404c610e45f4b5adfe93f430
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.982334 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-config\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.982409 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.982467 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ldhv\" (UniqueName: \"kubernetes.io/projected/aa7f944f-cd9d-418a-a282-ed67ebf5961d-kube-api-access-5ldhv\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.982481 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.983448 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa7f944f-cd9d-418a-a282-ed67ebf5961d" (UID: "aa7f944f-cd9d-418a-a282-ed67ebf5961d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:10 crc kubenswrapper[4892]: I0217 18:03:10.986198 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa7f944f-cd9d-418a-a282-ed67ebf5961d" (UID: "aa7f944f-cd9d-418a-a282-ed67ebf5961d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.084167 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.084206 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa7f944f-cd9d-418a-a282-ed67ebf5961d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.164477 4892 generic.go:334] "Generic (PLEG): container finished" podID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerID="b5103e207bc9f1bea8a4e92662e06c8699afde4b9d2651bb605a0f7ad28da5fc" exitCode=0
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.164715 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" event={"ID":"04ff949e-3323-4573-ac6a-4b1714b1976c","Type":"ContainerDied","Data":"b5103e207bc9f1bea8a4e92662e06c8699afde4b9d2651bb605a0f7ad28da5fc"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.164746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" event={"ID":"04ff949e-3323-4573-ac6a-4b1714b1976c","Type":"ContainerStarted","Data":"9305ec546ef009b56ab867fec42819b937adf5fa644ec35b85a2b59d513a8543"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.168658 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nc5l8" event={"ID":"545242bc-24e5-4521-85d8-7aff5cbd4916","Type":"ContainerStarted","Data":"6fd040ff2f0bbaf8897a4c0c90ad5412bbd37650ae24bb20fcdb3b64bce77634"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.173284 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf" event={"ID":"aa7f944f-cd9d-418a-a282-ed67ebf5961d","Type":"ContainerDied","Data":"4ccd25457ff898382e16ef574b43d189c55a658bb8a290ca348dae6d6a245033"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.173331 4892 scope.go:117] "RemoveContainer" containerID="4a3685cf47e4ca1d92082ffdb52ba15e98a7a0b501edf74c6ce91738d76cb316"
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.173529 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-nxqvf"
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.198149 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16369677-41cd-4100-a9dd-6a015abc114f","Type":"ContainerStarted","Data":"f0c9b8fe16360f0c73a14a99e627be7afdfc0a78720990740f86f914c3c925a8"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.246417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q48cp" event={"ID":"b6b151d3-a2dd-4201-b4a4-e2f332530eaa","Type":"ContainerStarted","Data":"ab6e9785f15283af7e10144fb1b893d0defb9dedbde94a7d3c471d856a81f83b"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.246746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q48cp" event={"ID":"b6b151d3-a2dd-4201-b4a4-e2f332530eaa","Type":"ContainerStarted","Data":"0f38cba2873ec096c3c758476a64ad139acc255f3a7ded6c719b70822eeb9d01"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.271560 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nxqvf"]
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.283358 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-nxqvf"]
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.283599 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-q48cp" podStartSLOduration=3.283587052 podStartE2EDuration="3.283587052s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:11.26556453 +0000 UTC m=+1162.640967805" watchObservedRunningTime="2026-02-17 18:03:11.283587052 +0000 UTC m=+1162.658990317"
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.285778 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8faf0043-ec66-4725-9db0-e0b7a4d3da6d","Type":"ContainerStarted","Data":"0e1de3fa4771e1abd1be750d53637068e3f7b547404c610e45f4b5adfe93f430"}
Feb 17 18:03:11 crc kubenswrapper[4892]: I0217 18:03:11.377891 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa7f944f-cd9d-418a-a282-ed67ebf5961d" path="/var/lib/kubelet/pods/aa7f944f-cd9d-418a-a282-ed67ebf5961d/volumes"
Feb 17 18:03:12 crc kubenswrapper[4892]: I0217 18:03:12.306859 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8faf0043-ec66-4725-9db0-e0b7a4d3da6d","Type":"ContainerStarted","Data":"6246aeb2ade0adec8acce45f136be2a5c60aaf0b11295b2adbdaf8d7bac2c4f1"}
Feb 17 18:03:12 crc kubenswrapper[4892]: I0217 18:03:12.320309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" event={"ID":"04ff949e-3323-4573-ac6a-4b1714b1976c","Type":"ContainerStarted","Data":"e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1"}
Feb 17 18:03:12 crc kubenswrapper[4892]: I0217 18:03:12.320943 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w"
Feb 17 18:03:12 crc kubenswrapper[4892]: I0217 18:03:12.328203 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16369677-41cd-4100-a9dd-6a015abc114f","Type":"ContainerStarted","Data":"c4a196bf2d492c309554895afd4605e63dc018017cfb378e06d3dc7157f1d076"}
Feb 17 18:03:12 crc kubenswrapper[4892]: I0217 18:03:12.344334 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" podStartSLOduration=4.344320332 podStartE2EDuration="4.344320332s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:12.338489656 +0000 UTC m=+1163.713892921" watchObservedRunningTime="2026-02-17 18:03:12.344320332 +0000 UTC m=+1163.719723597"
Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.347536 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16369677-41cd-4100-a9dd-6a015abc114f","Type":"ContainerStarted","Data":"f4887c3314b9e14308f24778bd2b0e62ab2da2fd653b968ca3642d6372b9cb2e"}
Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.348084 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-log" containerID="cri-o://c4a196bf2d492c309554895afd4605e63dc018017cfb378e06d3dc7157f1d076" gracePeriod=30
Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.348240 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-httpd" containerID="cri-o://f4887c3314b9e14308f24778bd2b0e62ab2da2fd653b968ca3642d6372b9cb2e" gracePeriod=30
Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.357074 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8faf0043-ec66-4725-9db0-e0b7a4d3da6d","Type":"ContainerStarted","Data":"a6218868020c69142a9a3ebdaa81f46d52f2a1989d02635a3b35de25743a226a"}
Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.357344 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-log" containerID="cri-o://6246aeb2ade0adec8acce45f136be2a5c60aaf0b11295b2adbdaf8d7bac2c4f1" gracePeriod=30
Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.357377 4892 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/glance-default-external-api-0" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-httpd" containerID="cri-o://a6218868020c69142a9a3ebdaa81f46d52f2a1989d02635a3b35de25743a226a" gracePeriod=30 Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.370720 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.370700273 podStartE2EDuration="5.370700273s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:13.36797699 +0000 UTC m=+1164.743380265" watchObservedRunningTime="2026-02-17 18:03:13.370700273 +0000 UTC m=+1164.746103538" Feb 17 18:03:13 crc kubenswrapper[4892]: I0217 18:03:13.399396 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.399378091 podStartE2EDuration="5.399378091s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:13.397519881 +0000 UTC m=+1164.772923146" watchObservedRunningTime="2026-02-17 18:03:13.399378091 +0000 UTC m=+1164.774781356" Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.368529 4892 generic.go:334] "Generic (PLEG): container finished" podID="2c48fb34-4ec6-4cf7-8988-76e9d9549e23" containerID="3a5a394db3114515f27acb7b3724e0a08543612a8c1883d88a2dbc0f06751727" exitCode=0 Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.368723 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x4dsw" event={"ID":"2c48fb34-4ec6-4cf7-8988-76e9d9549e23","Type":"ContainerDied","Data":"3a5a394db3114515f27acb7b3724e0a08543612a8c1883d88a2dbc0f06751727"} Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 
18:03:14.372986 4892 generic.go:334] "Generic (PLEG): container finished" podID="16369677-41cd-4100-a9dd-6a015abc114f" containerID="f4887c3314b9e14308f24778bd2b0e62ab2da2fd653b968ca3642d6372b9cb2e" exitCode=0 Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.373004 4892 generic.go:334] "Generic (PLEG): container finished" podID="16369677-41cd-4100-a9dd-6a015abc114f" containerID="c4a196bf2d492c309554895afd4605e63dc018017cfb378e06d3dc7157f1d076" exitCode=143 Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.373048 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16369677-41cd-4100-a9dd-6a015abc114f","Type":"ContainerDied","Data":"f4887c3314b9e14308f24778bd2b0e62ab2da2fd653b968ca3642d6372b9cb2e"} Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.373065 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16369677-41cd-4100-a9dd-6a015abc114f","Type":"ContainerDied","Data":"c4a196bf2d492c309554895afd4605e63dc018017cfb378e06d3dc7157f1d076"} Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.376104 4892 generic.go:334] "Generic (PLEG): container finished" podID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerID="a6218868020c69142a9a3ebdaa81f46d52f2a1989d02635a3b35de25743a226a" exitCode=0 Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.376127 4892 generic.go:334] "Generic (PLEG): container finished" podID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerID="6246aeb2ade0adec8acce45f136be2a5c60aaf0b11295b2adbdaf8d7bac2c4f1" exitCode=143 Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.376145 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8faf0043-ec66-4725-9db0-e0b7a4d3da6d","Type":"ContainerDied","Data":"a6218868020c69142a9a3ebdaa81f46d52f2a1989d02635a3b35de25743a226a"} Feb 17 18:03:14 crc kubenswrapper[4892]: I0217 18:03:14.376162 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8faf0043-ec66-4725-9db0-e0b7a4d3da6d","Type":"ContainerDied","Data":"6246aeb2ade0adec8acce45f136be2a5c60aaf0b11295b2adbdaf8d7bac2c4f1"} Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.421175 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.431392 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x4dsw" event={"ID":"2c48fb34-4ec6-4cf7-8988-76e9d9549e23","Type":"ContainerDied","Data":"52990d10d54c8cab3d481077a05e7c5e592102f95c7fbf490d20e3745b94abf2"} Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.431438 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52990d10d54c8cab3d481077a05e7c5e592102f95c7fbf490d20e3745b94abf2" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.575106 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-config-data\") pod \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.575426 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-combined-ca-bundle\") pod \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.575485 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-fernet-keys\") pod \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\" (UID: 
\"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.575556 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7v8x\" (UniqueName: \"kubernetes.io/projected/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-kube-api-access-d7v8x\") pod \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.575661 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-scripts\") pod \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.575766 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-credential-keys\") pod \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\" (UID: \"2c48fb34-4ec6-4cf7-8988-76e9d9549e23\") " Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.581463 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-kube-api-access-d7v8x" (OuterVolumeSpecName: "kube-api-access-d7v8x") pod "2c48fb34-4ec6-4cf7-8988-76e9d9549e23" (UID: "2c48fb34-4ec6-4cf7-8988-76e9d9549e23"). InnerVolumeSpecName "kube-api-access-d7v8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.596290 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2c48fb34-4ec6-4cf7-8988-76e9d9549e23" (UID: "2c48fb34-4ec6-4cf7-8988-76e9d9549e23"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.596389 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-scripts" (OuterVolumeSpecName: "scripts") pod "2c48fb34-4ec6-4cf7-8988-76e9d9549e23" (UID: "2c48fb34-4ec6-4cf7-8988-76e9d9549e23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.618787 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2c48fb34-4ec6-4cf7-8988-76e9d9549e23" (UID: "2c48fb34-4ec6-4cf7-8988-76e9d9549e23"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.683090 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.683125 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7v8x\" (UniqueName: \"kubernetes.io/projected/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-kube-api-access-d7v8x\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.683136 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.683145 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:18 crc 
kubenswrapper[4892]: I0217 18:03:18.699072 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c48fb34-4ec6-4cf7-8988-76e9d9549e23" (UID: "2c48fb34-4ec6-4cf7-8988-76e9d9549e23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.733098 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-config-data" (OuterVolumeSpecName: "config-data") pod "2c48fb34-4ec6-4cf7-8988-76e9d9549e23" (UID: "2c48fb34-4ec6-4cf7-8988-76e9d9549e23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.785048 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:18 crc kubenswrapper[4892]: I0217 18:03:18.785079 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c48fb34-4ec6-4cf7-8988-76e9d9549e23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.086058 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.215563 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-b2865"] Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.216196 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-b2865" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" 
containerID="cri-o://bccde332d60d6b0e45f0bf05b2c45bb513cdbbe8b8c25952d2bde8ff07e3a9e4" gracePeriod=10 Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.296381 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-b2865" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: connect: connection refused" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.444710 4892 generic.go:334] "Generic (PLEG): container finished" podID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerID="bccde332d60d6b0e45f0bf05b2c45bb513cdbbe8b8c25952d2bde8ff07e3a9e4" exitCode=0 Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.444799 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x4dsw" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.445700 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-b2865" event={"ID":"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7","Type":"ContainerDied","Data":"bccde332d60d6b0e45f0bf05b2c45bb513cdbbe8b8c25952d2bde8ff07e3a9e4"} Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.543722 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x4dsw"] Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.551882 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x4dsw"] Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.641272 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8vb5k"] Feb 17 18:03:19 crc kubenswrapper[4892]: E0217 18:03:19.641754 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7f944f-cd9d-418a-a282-ed67ebf5961d" containerName="init" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.641767 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aa7f944f-cd9d-418a-a282-ed67ebf5961d" containerName="init" Feb 17 18:03:19 crc kubenswrapper[4892]: E0217 18:03:19.641776 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c48fb34-4ec6-4cf7-8988-76e9d9549e23" containerName="keystone-bootstrap" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.641783 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c48fb34-4ec6-4cf7-8988-76e9d9549e23" containerName="keystone-bootstrap" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.642027 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c48fb34-4ec6-4cf7-8988-76e9d9549e23" containerName="keystone-bootstrap" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.642054 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa7f944f-cd9d-418a-a282-ed67ebf5961d" containerName="init" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.642665 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.644795 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.644871 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.645238 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jpwh4" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.646041 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.646132 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.654322 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-8vb5k"] Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.808268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-fernet-keys\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.808352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-combined-ca-bundle\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.808400 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gv4\" (UniqueName: \"kubernetes.io/projected/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-kube-api-access-57gv4\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.808440 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-scripts\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.808541 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-config-data\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " 
pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.808587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-credential-keys\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.909667 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-credential-keys\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.909783 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-fernet-keys\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.909843 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-combined-ca-bundle\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.909882 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57gv4\" (UniqueName: \"kubernetes.io/projected/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-kube-api-access-57gv4\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc 
kubenswrapper[4892]: I0217 18:03:19.909915 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-scripts\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.909963 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-config-data\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.914497 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-combined-ca-bundle\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.915162 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-config-data\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.915611 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-credential-keys\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.915963 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-fernet-keys\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.926416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-scripts\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.930050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gv4\" (UniqueName: \"kubernetes.io/projected/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-kube-api-access-57gv4\") pod \"keystone-bootstrap-8vb5k\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:19 crc kubenswrapper[4892]: I0217 18:03:19.963106 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:21 crc kubenswrapper[4892]: I0217 18:03:21.380135 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c48fb34-4ec6-4cf7-8988-76e9d9549e23" path="/var/lib/kubelet/pods/2c48fb34-4ec6-4cf7-8988-76e9d9549e23/volumes" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.502486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16369677-41cd-4100-a9dd-6a015abc114f","Type":"ContainerDied","Data":"f0c9b8fe16360f0c73a14a99e627be7afdfc0a78720990740f86f914c3c925a8"} Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.502769 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c9b8fe16360f0c73a14a99e627be7afdfc0a78720990740f86f914c3c925a8" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.525499 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.560666 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n66p4\" (UniqueName: \"kubernetes.io/projected/16369677-41cd-4100-a9dd-6a015abc114f-kube-api-access-n66p4\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.560734 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-scripts\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.560802 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-internal-tls-certs\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.560847 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-logs\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.560996 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-config-data\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.561035 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.561065 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-combined-ca-bundle\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.561110 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-httpd-run\") pod \"16369677-41cd-4100-a9dd-6a015abc114f\" (UID: \"16369677-41cd-4100-a9dd-6a015abc114f\") " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.561756 4892 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.564988 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-logs" (OuterVolumeSpecName: "logs") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.568037 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.568296 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16369677-41cd-4100-a9dd-6a015abc114f-kube-api-access-n66p4" (OuterVolumeSpecName: "kube-api-access-n66p4") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "kube-api-access-n66p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.570772 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-scripts" (OuterVolumeSpecName: "scripts") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.611694 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.630333 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.638849 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-config-data" (OuterVolumeSpecName: "config-data") pod "16369677-41cd-4100-a9dd-6a015abc114f" (UID: "16369677-41cd-4100-a9dd-6a015abc114f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665003 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665038 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665049 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665066 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n66p4\" (UniqueName: \"kubernetes.io/projected/16369677-41cd-4100-a9dd-6a015abc114f-kube-api-access-n66p4\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665075 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665084 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665092 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16369677-41cd-4100-a9dd-6a015abc114f-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.665102 4892 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16369677-41cd-4100-a9dd-6a015abc114f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.697858 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:03:22 crc kubenswrapper[4892]: I0217 18:03:22.766339 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.509891 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.536679 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.559221 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.575047 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:03:23 crc kubenswrapper[4892]: E0217 18:03:23.575571 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-log" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.575590 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-log" Feb 17 18:03:23 crc kubenswrapper[4892]: E0217 18:03:23.575604 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-httpd" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.575610 4892 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-httpd" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.575847 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-log" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.575873 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="16369677-41cd-4100-a9dd-6a015abc114f" containerName="glance-httpd" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.576983 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.583541 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.584910 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.585989 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712001 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-logs\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712081 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 
crc kubenswrapper[4892]: I0217 18:03:23.712106 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712229 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712277 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hqqp\" (UniqueName: \"kubernetes.io/projected/2083df34-114e-4d2e-a85c-7a9ca940defa-kube-api-access-5hqqp\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712314 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.712366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814185 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hqqp\" (UniqueName: \"kubernetes.io/projected/2083df34-114e-4d2e-a85c-7a9ca940defa-kube-api-access-5hqqp\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814216 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814277 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814320 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-logs\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814361 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814379 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.814399 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.815081 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-logs\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.815251 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.815394 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.820451 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.820572 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.823968 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.830777 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.833301 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hqqp\" (UniqueName: \"kubernetes.io/projected/2083df34-114e-4d2e-a85c-7a9ca940defa-kube-api-access-5hqqp\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.863473 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:03:23 crc kubenswrapper[4892]: I0217 18:03:23.939542 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:25 crc kubenswrapper[4892]: I0217 18:03:25.370934 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16369677-41cd-4100-a9dd-6a015abc114f" path="/var/lib/kubelet/pods/16369677-41cd-4100-a9dd-6a015abc114f/volumes" Feb 17 18:03:29 crc kubenswrapper[4892]: I0217 18:03:29.295353 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-b2865" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: i/o timeout" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.008053 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.014002 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127565 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127611 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-public-tls-certs\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127641 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-combined-ca-bundle\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127680 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-svc\") pod \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127758 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-logs\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc 
kubenswrapper[4892]: I0217 18:03:34.127789 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-httpd-run\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127825 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldz44\" (UniqueName: \"kubernetes.io/projected/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-kube-api-access-ldz44\") pod \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127876 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-config\") pod \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127934 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-nb\") pod \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127953 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-scripts\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.127982 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-config-data\") pod 
\"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.128025 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgqrn\" (UniqueName: \"kubernetes.io/projected/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-kube-api-access-vgqrn\") pod \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\" (UID: \"8faf0043-ec66-4725-9db0-e0b7a4d3da6d\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.128047 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-swift-storage-0\") pod \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.128081 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-sb\") pod \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\" (UID: \"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7\") " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.128884 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-logs" (OuterVolumeSpecName: "logs") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.129056 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.133905 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.134318 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-kube-api-access-vgqrn" (OuterVolumeSpecName: "kube-api-access-vgqrn") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "kube-api-access-vgqrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.138061 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-scripts" (OuterVolumeSpecName: "scripts") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.147795 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-kube-api-access-ldz44" (OuterVolumeSpecName: "kube-api-access-ldz44") pod "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" (UID: "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7"). InnerVolumeSpecName "kube-api-access-ldz44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.194502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" (UID: "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.195104 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.203110 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" (UID: "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.204396 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" (UID: "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.206322 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-config" (OuterVolumeSpecName: "config") pod "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" (UID: "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.217309 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.224919 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" (UID: "aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.226193 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-config-data" (OuterVolumeSpecName: "config-data") pod "8faf0043-ec66-4725-9db0-e0b7a4d3da6d" (UID: "8faf0043-ec66-4725-9db0-e0b7a4d3da6d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229681 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229708 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229722 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229732 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229740 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229749 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229757 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldz44\" (UniqueName: \"kubernetes.io/projected/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-kube-api-access-ldz44\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229765 4892 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229773 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229780 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229788 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229796 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgqrn\" (UniqueName: \"kubernetes.io/projected/8faf0043-ec66-4725-9db0-e0b7a4d3da6d-kube-api-access-vgqrn\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229803 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.229825 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.251798 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 
18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.295840 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-b2865" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: i/o timeout" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.295939 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.332139 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.623378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-b2865" event={"ID":"aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7","Type":"ContainerDied","Data":"d75e0ccc46e0ea3bf8a05a62b3a9be810cf3bfcb87f9d8570b440d8ffd8dfa13"} Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.623419 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-b2865" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.623438 4892 scope.go:117] "RemoveContainer" containerID="bccde332d60d6b0e45f0bf05b2c45bb513cdbbe8b8c25952d2bde8ff07e3a9e4" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.627490 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8faf0043-ec66-4725-9db0-e0b7a4d3da6d","Type":"ContainerDied","Data":"0e1de3fa4771e1abd1be750d53637068e3f7b547404c610e45f4b5adfe93f430"} Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.627535 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.662659 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-b2865"] Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.673543 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-b2865"] Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.686428 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.699795 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.709947 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:34 crc kubenswrapper[4892]: E0217 18:03:34.710412 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710430 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" Feb 17 18:03:34 crc kubenswrapper[4892]: E0217 18:03:34.710441 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="init" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710447 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="init" Feb 17 18:03:34 crc kubenswrapper[4892]: E0217 18:03:34.710476 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-httpd" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710482 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-httpd" Feb 17 18:03:34 crc kubenswrapper[4892]: E0217 18:03:34.710494 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-log" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710499 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-log" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710678 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-log" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710695 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" containerName="glance-httpd" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.710714 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" containerName="dnsmasq-dns" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.711746 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.714962 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.715146 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.718189 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.840520 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-logs\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.840562 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-scripts\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.840582 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.840807 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cks\" (UniqueName: 
\"kubernetes.io/projected/a1f5f86e-d42f-4224-862d-31337ee26ae5-kube-api-access-l4cks\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.841009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.841070 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.841182 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-config-data\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.841226 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942526 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l4cks\" (UniqueName: \"kubernetes.io/projected/a1f5f86e-d42f-4224-862d-31337ee26ae5-kube-api-access-l4cks\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942623 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942660 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-config-data\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942792 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-logs\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942858 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-scripts\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942884 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.942968 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.943482 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.943557 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-logs\") pod 
\"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.948093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.948281 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.948740 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-scripts\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.950184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-config-data\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.964212 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4cks\" (UniqueName: \"kubernetes.io/projected/a1f5f86e-d42f-4224-862d-31337ee26ae5-kube-api-access-l4cks\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " 
pod="openstack/glance-default-external-api-0" Feb 17 18:03:34 crc kubenswrapper[4892]: I0217 18:03:34.976477 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " pod="openstack/glance-default-external-api-0" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.061536 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:03:35 crc kubenswrapper[4892]: E0217 18:03:35.178884 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 18:03:35 crc kubenswrapper[4892]: E0217 18:03:35.179109 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjgkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7rrhb_openstack(e85111e5-c198-4b36-bdce-5eb7a8ff75ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 18:03:35 crc kubenswrapper[4892]: E0217 18:03:35.181165 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7rrhb" podUID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.190110 4892 scope.go:117] "RemoveContainer" containerID="0fe2b4ec3dd7289b03b187a5f65791a6d000005cecae7271c83dbd7b1687ece0" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.354517 4892 scope.go:117] "RemoveContainer" containerID="a6218868020c69142a9a3ebdaa81f46d52f2a1989d02635a3b35de25743a226a" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.382193 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8faf0043-ec66-4725-9db0-e0b7a4d3da6d" path="/var/lib/kubelet/pods/8faf0043-ec66-4725-9db0-e0b7a4d3da6d/volumes" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.383015 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7" path="/var/lib/kubelet/pods/aae3f43d-c805-4f99-a5ca-8f9b29cdb4a7/volumes" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.393548 4892 scope.go:117] "RemoveContainer" containerID="6246aeb2ade0adec8acce45f136be2a5c60aaf0b11295b2adbdaf8d7bac2c4f1" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.638187 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-79pp7" 
event={"ID":"3be41152-a14a-43b1-b24e-221e614556df","Type":"ContainerStarted","Data":"13772a7bab2e085826df0815a87fd8bfefe58b5d98ecd96e8581b82a84d62443"} Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.640083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerStarted","Data":"1ca249ccf30d54590b7bedfc5fd4d93955d7342105c6c4b76747c5f1f064b57b"} Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.650453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nc5l8" event={"ID":"545242bc-24e5-4521-85d8-7aff5cbd4916","Type":"ContainerStarted","Data":"47ba413927d309d951e32487d20bc96e08504a3f1ddb3fad89ea5e5a59e190b4"} Feb 17 18:03:35 crc kubenswrapper[4892]: E0217 18:03:35.652343 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-7rrhb" podUID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.661996 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8vb5k"] Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.668797 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-79pp7" podStartSLOduration=2.566167452 podStartE2EDuration="27.668776843s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="2026-02-17 18:03:10.048900747 +0000 UTC m=+1161.424304002" lastFinishedPulling="2026-02-17 18:03:35.151510128 +0000 UTC m=+1186.526913393" observedRunningTime="2026-02-17 18:03:35.655881907 +0000 UTC m=+1187.031285192" watchObservedRunningTime="2026-02-17 18:03:35.668776843 +0000 UTC m=+1187.044180108" Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.686248 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nc5l8" podStartSLOduration=2.58517721 podStartE2EDuration="27.68622982s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="2026-02-17 18:03:10.089157374 +0000 UTC m=+1161.464560639" lastFinishedPulling="2026-02-17 18:03:35.190209984 +0000 UTC m=+1186.565613249" observedRunningTime="2026-02-17 18:03:35.681968306 +0000 UTC m=+1187.057371571" watchObservedRunningTime="2026-02-17 18:03:35.68622982 +0000 UTC m=+1187.061633085" Feb 17 18:03:35 crc kubenswrapper[4892]: W0217 18:03:35.688798 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9de8b31_40f1_42b9_b9c2_3d1cf759d9aa.slice/crio-eccad562e4f758c2bc7142f72bb9950c425de858d4dbce091866048516f22f7e WatchSource:0}: Error finding container eccad562e4f758c2bc7142f72bb9950c425de858d4dbce091866048516f22f7e: Status 404 returned error can't find the container with id eccad562e4f758c2bc7142f72bb9950c425de858d4dbce091866048516f22f7e Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.901115 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:03:35 crc kubenswrapper[4892]: I0217 18:03:35.987993 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.667044 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1f5f86e-d42f-4224-862d-31337ee26ae5","Type":"ContainerStarted","Data":"45f3afdcde22cd5653dcf98eb41e608b08f4b0c0f952e1bd8f4733607e7c9c02"} Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.667640 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a1f5f86e-d42f-4224-862d-31337ee26ae5","Type":"ContainerStarted","Data":"06dd33940ae2315f8e5deff14eaac70cef1ba80a24ab41c5ddd831c749496d49"} Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.671683 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8vb5k" event={"ID":"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa","Type":"ContainerStarted","Data":"a025ad92e083691a77b535757d3d225e57599fa5fb5a9fd39fba4ecee4733b2b"} Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.671737 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8vb5k" event={"ID":"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa","Type":"ContainerStarted","Data":"eccad562e4f758c2bc7142f72bb9950c425de858d4dbce091866048516f22f7e"} Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.674170 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2083df34-114e-4d2e-a85c-7a9ca940defa","Type":"ContainerStarted","Data":"17617b8e76c7e5c7a5b693190d099203a08b989a0c62bcf2c5a605412934d8bc"} Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.674205 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2083df34-114e-4d2e-a85c-7a9ca940defa","Type":"ContainerStarted","Data":"e8551b202935e45099ef6792acc2b97c171e2581e16bcf43a4b72911434e0e0f"} Feb 17 18:03:36 crc kubenswrapper[4892]: I0217 18:03:36.700194 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8vb5k" podStartSLOduration=17.700177777 podStartE2EDuration="17.700177777s" podCreationTimestamp="2026-02-17 18:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:36.694217968 +0000 UTC m=+1188.069621253" watchObservedRunningTime="2026-02-17 18:03:36.700177777 +0000 UTC m=+1188.075581042" Feb 17 18:03:37 crc 
kubenswrapper[4892]: I0217 18:03:37.425240 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.425633 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.425689 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.426411 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39bb78d4e8e45cbad6e675e4abf8cc16247b09380bca60a00134831853f3fc17"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.426472 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://39bb78d4e8e45cbad6e675e4abf8cc16247b09380bca60a00134831853f3fc17" gracePeriod=600 Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.691141 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"2083df34-114e-4d2e-a85c-7a9ca940defa","Type":"ContainerStarted","Data":"9472b66a1c3b7f7c9c67281b9625bc73d46a15d07425d6c7adac8bfb9c882a0b"} Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.699620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1f5f86e-d42f-4224-862d-31337ee26ae5","Type":"ContainerStarted","Data":"de10374cd7c9b05b17343f79c93b879107820127acee324e6c40ecf2299e73ef"} Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.709541 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="39bb78d4e8e45cbad6e675e4abf8cc16247b09380bca60a00134831853f3fc17" exitCode=0 Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.710491 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"39bb78d4e8e45cbad6e675e4abf8cc16247b09380bca60a00134831853f3fc17"} Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.710535 4892 scope.go:117] "RemoveContainer" containerID="277a892ddcda11348b051b3a2c03162bd0db1300ec13dbc17277b62b780132f1" Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.741966 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.74194633 podStartE2EDuration="14.74194633s" podCreationTimestamp="2026-02-17 18:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:37.721236526 +0000 UTC m=+1189.096639811" watchObservedRunningTime="2026-02-17 18:03:37.74194633 +0000 UTC m=+1189.117349595" Feb 17 18:03:37 crc kubenswrapper[4892]: I0217 18:03:37.750988 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=3.750972691 podStartE2EDuration="3.750972691s" podCreationTimestamp="2026-02-17 18:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:37.741623341 +0000 UTC m=+1189.117026626" watchObservedRunningTime="2026-02-17 18:03:37.750972691 +0000 UTC m=+1189.126375956" Feb 17 18:03:38 crc kubenswrapper[4892]: I0217 18:03:38.727030 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerStarted","Data":"339f18965d225092aa6b74cb44351cec1a8bae37411ff5d9763012698e3bd335"} Feb 17 18:03:39 crc kubenswrapper[4892]: I0217 18:03:39.750253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"0b5f9c9cd974cd254191642f6cc48f52aa43de6e7dce9d7b7d9b694f86f42344"} Feb 17 18:03:43 crc kubenswrapper[4892]: I0217 18:03:43.940823 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:43 crc kubenswrapper[4892]: I0217 18:03:43.941323 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:43 crc kubenswrapper[4892]: I0217 18:03:43.990249 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:43 crc kubenswrapper[4892]: I0217 18:03:43.995942 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:44 crc kubenswrapper[4892]: I0217 18:03:44.808156 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:44 crc kubenswrapper[4892]: I0217 
18:03:44.808513 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.062459 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.062504 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.113556 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.115918 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.819653 4892 generic.go:334] "Generic (PLEG): container finished" podID="3be41152-a14a-43b1-b24e-221e614556df" containerID="13772a7bab2e085826df0815a87fd8bfefe58b5d98ecd96e8581b82a84d62443" exitCode=0 Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.819893 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-79pp7" event={"ID":"3be41152-a14a-43b1-b24e-221e614556df","Type":"ContainerDied","Data":"13772a7bab2e085826df0815a87fd8bfefe58b5d98ecd96e8581b82a84d62443"} Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.821397 4892 generic.go:334] "Generic (PLEG): container finished" podID="e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" containerID="a025ad92e083691a77b535757d3d225e57599fa5fb5a9fd39fba4ecee4733b2b" exitCode=0 Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.821469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8vb5k" event={"ID":"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa","Type":"ContainerDied","Data":"a025ad92e083691a77b535757d3d225e57599fa5fb5a9fd39fba4ecee4733b2b"} Feb 
17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.823552 4892 generic.go:334] "Generic (PLEG): container finished" podID="b6b151d3-a2dd-4201-b4a4-e2f332530eaa" containerID="ab6e9785f15283af7e10144fb1b893d0defb9dedbde94a7d3c471d856a81f83b" exitCode=0 Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.823596 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q48cp" event={"ID":"b6b151d3-a2dd-4201-b4a4-e2f332530eaa","Type":"ContainerDied","Data":"ab6e9785f15283af7e10144fb1b893d0defb9dedbde94a7d3c471d856a81f83b"} Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.824836 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 18:03:45 crc kubenswrapper[4892]: I0217 18:03:45.824860 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 18:03:46 crc kubenswrapper[4892]: I0217 18:03:46.834079 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerStarted","Data":"1b519ddd92ccb43096fe441e4b92da614f8084679a5f6b39b706dd73a6031e61"} Feb 17 18:03:46 crc kubenswrapper[4892]: I0217 18:03:46.835514 4892 generic.go:334] "Generic (PLEG): container finished" podID="545242bc-24e5-4521-85d8-7aff5cbd4916" containerID="47ba413927d309d951e32487d20bc96e08504a3f1ddb3fad89ea5e5a59e190b4" exitCode=0 Feb 17 18:03:46 crc kubenswrapper[4892]: I0217 18:03:46.835626 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nc5l8" event={"ID":"545242bc-24e5-4521-85d8-7aff5cbd4916","Type":"ContainerDied","Data":"47ba413927d309d951e32487d20bc96e08504a3f1ddb3fad89ea5e5a59e190b4"} Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.277227 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.286774 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.287991 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.412622 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-combined-ca-bundle\") pod \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.412680 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be41152-a14a-43b1-b24e-221e614556df-logs\") pod \"3be41152-a14a-43b1-b24e-221e614556df\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.412704 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-combined-ca-bundle\") pod \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.412747 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-scripts\") pod \"3be41152-a14a-43b1-b24e-221e614556df\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.412802 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57gv4\" 
(UniqueName: \"kubernetes.io/projected/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-kube-api-access-57gv4\") pod \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.412900 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-config-data\") pod \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413110 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78ddx\" (UniqueName: \"kubernetes.io/projected/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-kube-api-access-78ddx\") pod \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413142 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7lrm\" (UniqueName: \"kubernetes.io/projected/3be41152-a14a-43b1-b24e-221e614556df-kube-api-access-t7lrm\") pod \"3be41152-a14a-43b1-b24e-221e614556df\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413199 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-config\") pod \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\" (UID: \"b6b151d3-a2dd-4201-b4a4-e2f332530eaa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413223 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-fernet-keys\") pod \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: 
I0217 18:03:47.413285 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-scripts\") pod \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413320 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-combined-ca-bundle\") pod \"3be41152-a14a-43b1-b24e-221e614556df\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413363 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-config-data\") pod \"3be41152-a14a-43b1-b24e-221e614556df\" (UID: \"3be41152-a14a-43b1-b24e-221e614556df\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-credential-keys\") pod \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\" (UID: \"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa\") " Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.413782 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be41152-a14a-43b1-b24e-221e614556df-logs" (OuterVolumeSpecName: "logs") pod "3be41152-a14a-43b1-b24e-221e614556df" (UID: "3be41152-a14a-43b1-b24e-221e614556df"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.414098 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be41152-a14a-43b1-b24e-221e614556df-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.427348 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-scripts" (OuterVolumeSpecName: "scripts") pod "3be41152-a14a-43b1-b24e-221e614556df" (UID: "3be41152-a14a-43b1-b24e-221e614556df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.427373 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" (UID: "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.427473 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-scripts" (OuterVolumeSpecName: "scripts") pod "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" (UID: "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.430809 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" (UID: "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.431354 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-kube-api-access-57gv4" (OuterVolumeSpecName: "kube-api-access-57gv4") pod "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" (UID: "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa"). InnerVolumeSpecName "kube-api-access-57gv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.449264 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be41152-a14a-43b1-b24e-221e614556df-kube-api-access-t7lrm" (OuterVolumeSpecName: "kube-api-access-t7lrm") pod "3be41152-a14a-43b1-b24e-221e614556df" (UID: "3be41152-a14a-43b1-b24e-221e614556df"). InnerVolumeSpecName "kube-api-access-t7lrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.451319 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-kube-api-access-78ddx" (OuterVolumeSpecName: "kube-api-access-78ddx") pod "b6b151d3-a2dd-4201-b4a4-e2f332530eaa" (UID: "b6b151d3-a2dd-4201-b4a4-e2f332530eaa"). InnerVolumeSpecName "kube-api-access-78ddx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.458847 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-config" (OuterVolumeSpecName: "config") pod "b6b151d3-a2dd-4201-b4a4-e2f332530eaa" (UID: "b6b151d3-a2dd-4201-b4a4-e2f332530eaa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.467172 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3be41152-a14a-43b1-b24e-221e614556df" (UID: "3be41152-a14a-43b1-b24e-221e614556df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.467224 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-config-data" (OuterVolumeSpecName: "config-data") pod "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" (UID: "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.482427 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6b151d3-a2dd-4201-b4a4-e2f332530eaa" (UID: "b6b151d3-a2dd-4201-b4a4-e2f332530eaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.495370 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" (UID: "e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.502894 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-config-data" (OuterVolumeSpecName: "config-data") pod "3be41152-a14a-43b1-b24e-221e614556df" (UID: "3be41152-a14a-43b1-b24e-221e614556df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516134 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516166 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57gv4\" (UniqueName: \"kubernetes.io/projected/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-kube-api-access-57gv4\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516176 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516186 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78ddx\" (UniqueName: \"kubernetes.io/projected/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-kube-api-access-78ddx\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516196 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7lrm\" (UniqueName: \"kubernetes.io/projected/3be41152-a14a-43b1-b24e-221e614556df-kube-api-access-t7lrm\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516204 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516213 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516222 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516229 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516237 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be41152-a14a-43b1-b24e-221e614556df-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516245 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516253 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.516261 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b151d3-a2dd-4201-b4a4-e2f332530eaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:47 crc kubenswrapper[4892]: 
I0217 18:03:47.846687 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-79pp7" event={"ID":"3be41152-a14a-43b1-b24e-221e614556df","Type":"ContainerDied","Data":"fb032a59436448f4fc282fd5b98b03cd5baf1f63b043144e3afd96aa406d47b7"} Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.846738 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb032a59436448f4fc282fd5b98b03cd5baf1f63b043144e3afd96aa406d47b7" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.846705 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-79pp7" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.871530 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8vb5k" event={"ID":"e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa","Type":"ContainerDied","Data":"eccad562e4f758c2bc7142f72bb9950c425de858d4dbce091866048516f22f7e"} Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.871573 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eccad562e4f758c2bc7142f72bb9950c425de858d4dbce091866048516f22f7e" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.871664 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8vb5k" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.878675 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-q48cp" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.880277 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q48cp" event={"ID":"b6b151d3-a2dd-4201-b4a4-e2f332530eaa","Type":"ContainerDied","Data":"0f38cba2873ec096c3c758476a64ad139acc255f3a7ded6c719b70822eeb9d01"} Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.880322 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f38cba2873ec096c3c758476a64ad139acc255f3a7ded6c719b70822eeb9d01" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.880399 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.880407 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.977494 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-544cf5fc64-d8jkh"] Feb 17 18:03:47 crc kubenswrapper[4892]: E0217 18:03:47.978152 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" containerName="keystone-bootstrap" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.978170 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" containerName="keystone-bootstrap" Feb 17 18:03:47 crc kubenswrapper[4892]: E0217 18:03:47.978190 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be41152-a14a-43b1-b24e-221e614556df" containerName="placement-db-sync" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.978197 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be41152-a14a-43b1-b24e-221e614556df" containerName="placement-db-sync" Feb 17 18:03:47 crc kubenswrapper[4892]: E0217 18:03:47.978218 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6b151d3-a2dd-4201-b4a4-e2f332530eaa" containerName="neutron-db-sync" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.978224 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b151d3-a2dd-4201-b4a4-e2f332530eaa" containerName="neutron-db-sync" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.978433 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be41152-a14a-43b1-b24e-221e614556df" containerName="placement-db-sync" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.978455 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" containerName="keystone-bootstrap" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.978470 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b151d3-a2dd-4201-b4a4-e2f332530eaa" containerName="neutron-db-sync" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.979436 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.982040 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-856hl" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.984128 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.984465 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.984599 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.984751 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 18:03:47 crc kubenswrapper[4892]: I0217 18:03:47.987733 4892 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-544cf5fc64-d8jkh"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.096904 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dfdc985-pc7r9"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.098458 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.105568 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.109390 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.109838 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.109989 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.110118 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jpwh4" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.110202 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.115875 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dfdc985-pc7r9"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134489 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-public-tls-certs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 
18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134537 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-combined-ca-bundle\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134556 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-internal-tls-certs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134622 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-scripts\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134642 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b04ca1c-a720-49e9-81dd-9be5c4695174-logs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134675 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh8rk\" (UniqueName: \"kubernetes.io/projected/0b04ca1c-a720-49e9-81dd-9be5c4695174-kube-api-access-jh8rk\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" 
Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.134699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-config-data\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.181486 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-7rjq9"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.184461 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.199263 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-7rjq9"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238012 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-combined-ca-bundle\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238053 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-internal-tls-certs\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238082 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-scripts\") pod \"placement-544cf5fc64-d8jkh\" (UID: 
\"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238102 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b04ca1c-a720-49e9-81dd-9be5c4695174-logs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238140 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh8rk\" (UniqueName: \"kubernetes.io/projected/0b04ca1c-a720-49e9-81dd-9be5c4695174-kube-api-access-jh8rk\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238162 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-public-tls-certs\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-config-data\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238208 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4p6h\" (UniqueName: \"kubernetes.io/projected/2fd50e1e-cc22-430b-ab38-88217aeafc59-kube-api-access-x4p6h\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " 
pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238230 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-credential-keys\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238245 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-config-data\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-public-tls-certs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238325 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-combined-ca-bundle\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238341 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-internal-tls-certs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 
18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238364 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-fernet-keys\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.238384 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-scripts\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.239529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b04ca1c-a720-49e9-81dd-9be5c4695174-logs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.249559 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-scripts\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.251573 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-public-tls-certs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.252744 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-internal-tls-certs\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.252868 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-combined-ca-bundle\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.260508 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh8rk\" (UniqueName: \"kubernetes.io/projected/0b04ca1c-a720-49e9-81dd-9be5c4695174-kube-api-access-jh8rk\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.267034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-config-data\") pod \"placement-544cf5fc64-d8jkh\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") " pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.268031 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6674d5469b-4kmh6"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.273362 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.288213 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.288411 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8gj4q" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.288574 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.288714 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.301934 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6674d5469b-4kmh6"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.343645 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4p6h\" (UniqueName: \"kubernetes.io/projected/2fd50e1e-cc22-430b-ab38-88217aeafc59-kube-api-access-x4p6h\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.343712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.343736 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-credential-keys\") pod \"keystone-dfdc985-pc7r9\" (UID: 
\"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.343751 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.343772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-config-data\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.343810 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-config\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.344781 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-ovndb-tls-certs\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.344917 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: 
\"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.344965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcv5r\" (UniqueName: \"kubernetes.io/projected/5fb2cb6f-9055-428a-8ae8-d76893b49c68-kube-api-access-hcv5r\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345000 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-fernet-keys\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345040 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-scripts\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345058 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-config\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345087 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbr9\" (UniqueName: \"kubernetes.io/projected/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-kube-api-access-fmbr9\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " 
pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-internal-tls-certs\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345152 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-combined-ca-bundle\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345192 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-combined-ca-bundle\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345261 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-httpd-config\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.345286 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-public-tls-certs\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 
crc kubenswrapper[4892]: I0217 18:03:48.345303 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.349872 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-combined-ca-bundle\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.351875 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-credential-keys\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.352775 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-public-tls-certs\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.353717 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-config-data\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.355994 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.359204 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-fernet-keys\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.366331 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-scripts\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.374393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-internal-tls-certs\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.375624 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4p6h\" (UniqueName: \"kubernetes.io/projected/2fd50e1e-cc22-430b-ab38-88217aeafc59-kube-api-access-x4p6h\") pod \"keystone-dfdc985-pc7r9\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.433140 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463219 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcv5r\" (UniqueName: \"kubernetes.io/projected/5fb2cb6f-9055-428a-8ae8-d76893b49c68-kube-api-access-hcv5r\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463354 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-config\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463392 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbr9\" (UniqueName: \"kubernetes.io/projected/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-kube-api-access-fmbr9\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463507 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-combined-ca-bundle\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" 
Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463601 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-httpd-config\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463633 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463752 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.463804 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-config\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.465391 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-ovndb-tls-certs\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.472877 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.473638 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.480649 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-config\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.481312 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.496645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-combined-ca-bundle\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.496931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.497010 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcv5r\" (UniqueName: \"kubernetes.io/projected/5fb2cb6f-9055-428a-8ae8-d76893b49c68-kube-api-access-hcv5r\") pod \"dnsmasq-dns-5ccc5c4795-7rjq9\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") " pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.502147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-httpd-config\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.504331 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-ovndb-tls-certs\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.506664 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbr9\" (UniqueName: \"kubernetes.io/projected/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-kube-api-access-fmbr9\") pod 
\"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.513489 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.514727 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-config\") pod \"neutron-6674d5469b-4kmh6\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.535985 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dbf87fcbd-5txkh"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.538312 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.579059 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dbf87fcbd-5txkh"] Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.674436 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvztw\" (UniqueName: \"kubernetes.io/projected/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-kube-api-access-rvztw\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.674534 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-ovndb-tls-certs\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc 
kubenswrapper[4892]: I0217 18:03:48.674589 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-combined-ca-bundle\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.674628 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-httpd-config\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.674651 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-config\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.759346 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.778532 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvztw\" (UniqueName: \"kubernetes.io/projected/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-kube-api-access-rvztw\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.779061 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-ovndb-tls-certs\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.779683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-combined-ca-bundle\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.779748 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-httpd-config\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.779769 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-config\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 
18:03:48.783769 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-combined-ca-bundle\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.784043 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-config\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.798507 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvztw\" (UniqueName: \"kubernetes.io/projected/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-kube-api-access-rvztw\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.807947 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-ovndb-tls-certs\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.812285 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-httpd-config\") pod \"neutron-dbf87fcbd-5txkh\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.866905 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.912036 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.923412 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nc5l8" event={"ID":"545242bc-24e5-4521-85d8-7aff5cbd4916","Type":"ContainerDied","Data":"6fd040ff2f0bbaf8897a4c0c90ad5412bbd37650ae24bb20fcdb3b64bce77634"} Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.923454 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd040ff2f0bbaf8897a4c0c90ad5412bbd37650ae24bb20fcdb3b64bce77634" Feb 17 18:03:48 crc kubenswrapper[4892]: I0217 18:03:48.923517 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nc5l8" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.087613 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2gm2\" (UniqueName: \"kubernetes.io/projected/545242bc-24e5-4521-85d8-7aff5cbd4916-kube-api-access-v2gm2\") pod \"545242bc-24e5-4521-85d8-7aff5cbd4916\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.088767 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-db-sync-config-data\") pod \"545242bc-24e5-4521-85d8-7aff5cbd4916\" (UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.088803 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-combined-ca-bundle\") pod \"545242bc-24e5-4521-85d8-7aff5cbd4916\" 
(UID: \"545242bc-24e5-4521-85d8-7aff5cbd4916\") " Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.098222 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "545242bc-24e5-4521-85d8-7aff5cbd4916" (UID: "545242bc-24e5-4521-85d8-7aff5cbd4916"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.107162 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545242bc-24e5-4521-85d8-7aff5cbd4916-kube-api-access-v2gm2" (OuterVolumeSpecName: "kube-api-access-v2gm2") pod "545242bc-24e5-4521-85d8-7aff5cbd4916" (UID: "545242bc-24e5-4521-85d8-7aff5cbd4916"). InnerVolumeSpecName "kube-api-access-v2gm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.137924 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "545242bc-24e5-4521-85d8-7aff5cbd4916" (UID: "545242bc-24e5-4521-85d8-7aff5cbd4916"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.193295 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2gm2\" (UniqueName: \"kubernetes.io/projected/545242bc-24e5-4521-85d8-7aff5cbd4916-kube-api-access-v2gm2\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.193331 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.193339 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545242bc-24e5-4521-85d8-7aff5cbd4916-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.351815 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.352353 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.363541 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.363644 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.395963 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.411096 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.725207 4892 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-7rjq9"] Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.948257 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6674d5469b-4kmh6"] Feb 17 18:03:49 crc kubenswrapper[4892]: W0217 18:03:49.980984 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1b1f1d_e838_4b89_8d8b_e61b88a9917e.slice/crio-e0bbd765e7ff74b0aa03e4531e76589d6c438ece00b59fdd894288c586332e69 WatchSource:0}: Error finding container e0bbd765e7ff74b0aa03e4531e76589d6c438ece00b59fdd894288c586332e69: Status 404 returned error can't find the container with id e0bbd765e7ff74b0aa03e4531e76589d6c438ece00b59fdd894288c586332e69 Feb 17 18:03:49 crc kubenswrapper[4892]: I0217 18:03:49.989509 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dfdc985-pc7r9"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:49.999996 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-544cf5fc64-d8jkh"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.012574 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" event={"ID":"5fb2cb6f-9055-428a-8ae8-d76893b49c68","Type":"ContainerStarted","Data":"9c94221b7fe986ec5917cd3740ced375bca67812ff64592d29a51cda4e7b1769"} Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.074116 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dbf87fcbd-5txkh"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.197145 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65fb4dcbbc-jxbk8"] Feb 17 18:03:50 crc kubenswrapper[4892]: E0217 18:03:50.197615 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545242bc-24e5-4521-85d8-7aff5cbd4916" containerName="barbican-db-sync" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.197627 
4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="545242bc-24e5-4521-85d8-7aff5cbd4916" containerName="barbican-db-sync" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.197846 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="545242bc-24e5-4521-85d8-7aff5cbd4916" containerName="barbican-db-sync" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.198763 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.205966 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65fb4dcbbc-jxbk8"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.230407 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.230575 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vgzf6" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.230710 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.234945 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-667fbbdb6d-fdpm7"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.236504 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.251200 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.375249 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data-custom\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.375661 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a97132b-c4ac-4645-8844-1dc5acf466a1-logs\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.375710 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data-custom\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376225 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376261 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376388 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtq5\" (UniqueName: \"kubernetes.io/projected/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-kube-api-access-gjtq5\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376424 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-combined-ca-bundle\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376455 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-logs\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376492 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-combined-ca-bundle\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: 
\"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.376571 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trtn\" (UniqueName: \"kubernetes.io/projected/7a97132b-c4ac-4645-8844-1dc5acf466a1-kube-api-access-8trtn\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.422265 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667fbbdb6d-fdpm7"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484167 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtq5\" (UniqueName: \"kubernetes.io/projected/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-kube-api-access-gjtq5\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484454 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-combined-ca-bundle\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484509 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-logs\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 
18:03:50.484558 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-combined-ca-bundle\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484667 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trtn\" (UniqueName: \"kubernetes.io/projected/7a97132b-c4ac-4645-8844-1dc5acf466a1-kube-api-access-8trtn\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484753 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data-custom\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484933 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a97132b-c4ac-4645-8844-1dc5acf466a1-logs\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.484971 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data-custom\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 
18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.485055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.485074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.485980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-logs\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.486148 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a97132b-c4ac-4645-8844-1dc5acf466a1-logs\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.493178 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data-custom\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc 
kubenswrapper[4892]: I0217 18:03:50.493383 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.493813 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-combined-ca-bundle\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.508604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data-custom\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.510246 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.518006 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-combined-ca-bundle\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc 
kubenswrapper[4892]: I0217 18:03:50.523915 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trtn\" (UniqueName: \"kubernetes.io/projected/7a97132b-c4ac-4645-8844-1dc5acf466a1-kube-api-access-8trtn\") pod \"barbican-worker-65fb4dcbbc-jxbk8\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.524305 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtq5\" (UniqueName: \"kubernetes.io/projected/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-kube-api-access-gjtq5\") pod \"barbican-keystone-listener-667fbbdb6d-fdpm7\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.527111 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-7rjq9"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.553216 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-grk2x"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.555023 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.567204 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-grk2x"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.603406 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5dc464d6d6-dj24w"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.605496 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.609381 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.621715 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dc464d6d6-dj24w"] Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.626393 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.690271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-svc\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.690324 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbxk\" (UniqueName: \"kubernetes.io/projected/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-kube-api-access-ncbxk\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.690355 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.690385 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-config\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.690423 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.690494 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.754934 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.792872 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.792951 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f44e1ba4-1235-4601-94da-dd3400cdb7cc-logs\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793070 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793204 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q647\" (UniqueName: \"kubernetes.io/projected/f44e1ba4-1235-4601-94da-dd3400cdb7cc-kube-api-access-2q647\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793224 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-combined-ca-bundle\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: 
\"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793245 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-svc\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793265 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbxk\" (UniqueName: \"kubernetes.io/projected/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-kube-api-access-ncbxk\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793309 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data-custom\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793331 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: 
\"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.793350 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-config\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.798469 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.801604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-svc\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.801671 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.802819 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-config\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:50 crc kubenswrapper[4892]: 
I0217 18:03:50.807434 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.848350 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6674d5469b-4kmh6"]
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.853804 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbxk\" (UniqueName: \"kubernetes.io/projected/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-kube-api-access-ncbxk\") pod \"dnsmasq-dns-688c87cc99-grk2x\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " pod="openstack/dnsmasq-dns-688c87cc99-grk2x"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.895109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q647\" (UniqueName: \"kubernetes.io/projected/f44e1ba4-1235-4601-94da-dd3400cdb7cc-kube-api-access-2q647\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.895153 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-combined-ca-bundle\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.895196 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data-custom\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.895221 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.895265 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f44e1ba4-1235-4601-94da-dd3400cdb7cc-logs\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.895715 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f44e1ba4-1235-4601-94da-dd3400cdb7cc-logs\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.901441 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-combined-ca-bundle\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.917249 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.924642 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data-custom\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.927349 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5758f86b57-ddm7q"]
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.929133 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.930401 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q647\" (UniqueName: \"kubernetes.io/projected/f44e1ba4-1235-4601-94da-dd3400cdb7cc-kube-api-access-2q647\") pod \"barbican-api-5dc464d6d6-dj24w\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") " pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.936530 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.936656 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.946156 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5758f86b57-ddm7q"]
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.968348 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-grk2x"
Feb 17 18:03:50 crc kubenswrapper[4892]: I0217 18:03:50.977008 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.092368 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674d5469b-4kmh6" event={"ID":"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e","Type":"ContainerStarted","Data":"afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.094899 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6674d5469b-4kmh6"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.094925 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674d5469b-4kmh6" event={"ID":"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e","Type":"ContainerStarted","Data":"8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.094951 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674d5469b-4kmh6" event={"ID":"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e","Type":"ContainerStarted","Data":"e0bbd765e7ff74b0aa03e4531e76589d6c438ece00b59fdd894288c586332e69"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104432 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-httpd-config\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5wrv\" (UniqueName: \"kubernetes.io/projected/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-kube-api-access-x5wrv\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104535 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-public-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104553 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-ovndb-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104599 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-combined-ca-bundle\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104617 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-internal-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.104638 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-config\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.127300 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdc985-pc7r9" event={"ID":"2fd50e1e-cc22-430b-ab38-88217aeafc59","Type":"ContainerStarted","Data":"d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.127356 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdc985-pc7r9" event={"ID":"2fd50e1e-cc22-430b-ab38-88217aeafc59","Type":"ContainerStarted","Data":"9ca7613cc626c0571e208035f848864174e3337eac62a5892d4d3c372b970eb0"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.127412 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-dfdc985-pc7r9"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.133096 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544cf5fc64-d8jkh" event={"ID":"0b04ca1c-a720-49e9-81dd-9be5c4695174","Type":"ContainerStarted","Data":"032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.133508 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544cf5fc64-d8jkh" event={"ID":"0b04ca1c-a720-49e9-81dd-9be5c4695174","Type":"ContainerStarted","Data":"43dde95c545d91ff0eb2aca888d539a13a9b399124d85f84738e9d14ce3800da"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.136022 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6674d5469b-4kmh6" podStartSLOduration=3.135997357 podStartE2EDuration="3.135997357s" podCreationTimestamp="2026-02-17 18:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:51.116929376 +0000 UTC m=+1202.492332641" watchObservedRunningTime="2026-02-17 18:03:51.135997357 +0000 UTC m=+1202.511400632"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.136932 4892 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cb6f-9055-428a-8ae8-d76893b49c68" containerID="219940f2babea1f478053869a6de00830abbd54bbb0a262cf1cfbdd307af8bba" exitCode=0
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.136981 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" event={"ID":"5fb2cb6f-9055-428a-8ae8-d76893b49c68","Type":"ContainerDied","Data":"219940f2babea1f478053869a6de00830abbd54bbb0a262cf1cfbdd307af8bba"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.152878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbf87fcbd-5txkh" event={"ID":"3ad1c61c-5ce6-4b28-9034-5620d94bebc1","Type":"ContainerStarted","Data":"835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.152933 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbf87fcbd-5txkh" event={"ID":"3ad1c61c-5ce6-4b28-9034-5620d94bebc1","Type":"ContainerStarted","Data":"590c9db451aa20074af5d97e979853a7e68d07c2c2cd67572e98537d4730ff2b"}
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.158651 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dfdc985-pc7r9" podStartSLOduration=3.158633612 podStartE2EDuration="3.158633612s" podCreationTimestamp="2026-02-17 18:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:51.146978361 +0000 UTC m=+1202.522381626" watchObservedRunningTime="2026-02-17 18:03:51.158633612 +0000 UTC m=+1202.534036877"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.207203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5wrv\" (UniqueName: \"kubernetes.io/projected/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-kube-api-access-x5wrv\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.207258 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-public-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.207285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-ovndb-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.207367 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-combined-ca-bundle\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.209994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-internal-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.210098 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-config\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.210362 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-httpd-config\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.212016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-ovndb-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.213399 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-httpd-config\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.214344 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-public-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.216199 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-combined-ca-bundle\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.218861 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-internal-tls-certs\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.225866 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-config\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.242146 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667fbbdb6d-fdpm7"]
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.259487 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5wrv\" (UniqueName: \"kubernetes.io/projected/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-kube-api-access-x5wrv\") pod \"neutron-5758f86b57-ddm7q\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.501022 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65fb4dcbbc-jxbk8"]
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.547660 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:51 crc kubenswrapper[4892]: I0217 18:03:51.887994 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-grk2x"]
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.005712 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dc464d6d6-dj24w"]
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.178594 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5758f86b57-ddm7q"]
Feb 17 18:03:52 crc kubenswrapper[4892]: E0217 18:03:52.182641 4892 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Feb 17 18:03:52 crc kubenswrapper[4892]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5fb2cb6f-9055-428a-8ae8-d76893b49c68/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 17 18:03:52 crc kubenswrapper[4892]: > podSandboxID="9c94221b7fe986ec5917cd3740ced375bca67812ff64592d29a51cda4e7b1769"
Feb 17 18:03:52 crc kubenswrapper[4892]: E0217 18:03:52.182807 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 17 18:03:52 crc kubenswrapper[4892]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dchc4h67h57hb6h78hcdh7dh96h697h5d6h5ddh58h68ch575h574h686h56h578h5d6h8bh699h7chchcbhdfhb9h5d8h88h554h57h564q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcv5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc5c4795-7rjq9_openstack(5fb2cb6f-9055-428a-8ae8-d76893b49c68): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/5fb2cb6f-9055-428a-8ae8-d76893b49c68/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 17 18:03:52 crc kubenswrapper[4892]: > logger="UnhandledError"
Feb 17 18:03:52 crc kubenswrapper[4892]: E0217 18:03:52.184789 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5fb2cb6f-9055-428a-8ae8-d76893b49c68/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" podUID="5fb2cb6f-9055-428a-8ae8-d76893b49c68"
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.198003 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc464d6d6-dj24w" event={"ID":"f44e1ba4-1235-4601-94da-dd3400cdb7cc","Type":"ContainerStarted","Data":"e1ea2ae9ce2461063ccf80b5a0eb1e90428ad72e610eabd3d25386541b131daa"}
Feb 17 18:03:52 crc kubenswrapper[4892]: W0217 18:03:52.206369 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ffd802c_eec6_4c8c_a2a9_2571eba7bc33.slice/crio-c5040092cf864192d68f5c9d6f8aeb72cafd9400d6af4ecc44bd3b5f5a0b00bd WatchSource:0}: Error finding container c5040092cf864192d68f5c9d6f8aeb72cafd9400d6af4ecc44bd3b5f5a0b00bd: Status 404 returned error can't find the container with id c5040092cf864192d68f5c9d6f8aeb72cafd9400d6af4ecc44bd3b5f5a0b00bd
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.206417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbf87fcbd-5txkh" event={"ID":"3ad1c61c-5ce6-4b28-9034-5620d94bebc1","Type":"ContainerStarted","Data":"eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e"}
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.206518 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dbf87fcbd-5txkh"
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.223032 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" event={"ID":"bf3f5fcf-07b1-4f42-a5be-5b11052d080a","Type":"ContainerStarted","Data":"fa7b95a6c3e5e18dd00f12d6037cc4d743852c66ed7710d593c45a39ffa5f904"}
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.235876 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dbf87fcbd-5txkh" podStartSLOduration=4.2358576039999996 podStartE2EDuration="4.235857604s" podCreationTimestamp="2026-02-17 18:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:52.224375687 +0000 UTC m=+1203.599778952" watchObservedRunningTime="2026-02-17 18:03:52.235857604 +0000 UTC m=+1203.611260869"
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.239192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544cf5fc64-d8jkh" event={"ID":"0b04ca1c-a720-49e9-81dd-9be5c4695174","Type":"ContainerStarted","Data":"e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be"}
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.240630 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-544cf5fc64-d8jkh"
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.240664 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-544cf5fc64-d8jkh"
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.247663 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" event={"ID":"2eef18f1-0710-4c8f-af0b-d8c836eef0f1","Type":"ContainerStarted","Data":"82b8fdad36380d14c867d6d39ef23507bb7369edd16394412093d87494b594b8"}
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.251125 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6674d5469b-4kmh6" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-api" containerID="cri-o://8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f" gracePeriod=30
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.251368 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" event={"ID":"7a97132b-c4ac-4645-8844-1dc5acf466a1","Type":"ContainerStarted","Data":"85c0f6dc2cb435110be85b9a8316d687744dc6110745af68766aeda95c3c47eb"}
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.252218 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6674d5469b-4kmh6" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-httpd" containerID="cri-o://afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1" gracePeriod=30
Feb 17 18:03:52 crc kubenswrapper[4892]: I0217 18:03:52.284771 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-544cf5fc64-d8jkh" podStartSLOduration=5.284747542 podStartE2EDuration="5.284747542s" podCreationTimestamp="2026-02-17 18:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:52.274237602 +0000 UTC m=+1203.649640867" watchObservedRunningTime="2026-02-17 18:03:52.284747542 +0000 UTC m=+1203.660150807"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.303859 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5758f86b57-ddm7q" event={"ID":"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33","Type":"ContainerStarted","Data":"b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.304460 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5758f86b57-ddm7q" event={"ID":"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33","Type":"ContainerStarted","Data":"fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.304477 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5758f86b57-ddm7q"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.304486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5758f86b57-ddm7q" event={"ID":"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33","Type":"ContainerStarted","Data":"c5040092cf864192d68f5c9d6f8aeb72cafd9400d6af4ecc44bd3b5f5a0b00bd"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.315691 4892 generic.go:334] "Generic (PLEG): container finished" podID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerID="afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1" exitCode=0
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.315757 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674d5469b-4kmh6" event={"ID":"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e","Type":"ContainerDied","Data":"afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.327127 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5758f86b57-ddm7q" podStartSLOduration=3.327108441 podStartE2EDuration="3.327108441s" podCreationTimestamp="2026-02-17 18:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:53.326987197 +0000 UTC m=+1204.702390462" watchObservedRunningTime="2026-02-17 18:03:53.327108441 +0000 UTC m=+1204.702511706"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.330359 4892 generic.go:334] "Generic (PLEG): container finished" podID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerID="e530ac97433648e648bace58e5c8f15b83e507ba85496e9f58438ce16a3820ae" exitCode=0
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.330444 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" event={"ID":"2eef18f1-0710-4c8f-af0b-d8c836eef0f1","Type":"ContainerDied","Data":"e530ac97433648e648bace58e5c8f15b83e507ba85496e9f58438ce16a3820ae"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.338437 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7rrhb" event={"ID":"e85111e5-c198-4b36-bdce-5eb7a8ff75ea","Type":"ContainerStarted","Data":"13f881e4638fa1d108f0b4917c9d4bd23859a377ed76cbafeeead4d196d41cab"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.379897 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7rrhb" podStartSLOduration=4.127549941 podStartE2EDuration="45.379877443s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="2026-02-17 18:03:09.813441305 +0000 UTC m=+1161.188844570" lastFinishedPulling="2026-02-17 18:03:51.065768817 +0000 UTC m=+1202.441172072" observedRunningTime="2026-02-17 18:03:53.37041574 +0000 UTC m=+1204.745819005" watchObservedRunningTime="2026-02-17 18:03:53.379877443 +0000 UTC m=+1204.755280698"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.386279 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.386319 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc464d6d6-dj24w" event={"ID":"f44e1ba4-1235-4601-94da-dd3400cdb7cc","Type":"ContainerStarted","Data":"5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.386341 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc464d6d6-dj24w" event={"ID":"f44e1ba4-1235-4601-94da-dd3400cdb7cc","Type":"ContainerStarted","Data":"7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d"}
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.386359 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.398164 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dc464d6d6-dj24w" podStartSLOduration=3.398148973 podStartE2EDuration="3.398148973s" podCreationTimestamp="2026-02-17 18:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:53.389971574 +0000 UTC m=+1204.765374839" watchObservedRunningTime="2026-02-17 18:03:53.398148973 +0000 UTC m=+1204.773552238"
Feb 17 18:03:53 crc kubenswrapper[4892]: I0217 18:03:53.846872 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9"
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.016435 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-nb\") pod \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") "
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.016557 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-svc\") pod \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") "
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.016596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-swift-storage-0\") pod \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") "
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.016722 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcv5r\" (UniqueName: \"kubernetes.io/projected/5fb2cb6f-9055-428a-8ae8-d76893b49c68-kube-api-access-hcv5r\") pod \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") "
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.016753 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-sb\") pod \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") "
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.016874 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-config\") pod \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\" (UID: \"5fb2cb6f-9055-428a-8ae8-d76893b49c68\") "
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.038075 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb2cb6f-9055-428a-8ae8-d76893b49c68-kube-api-access-hcv5r" (OuterVolumeSpecName: "kube-api-access-hcv5r") pod "5fb2cb6f-9055-428a-8ae8-d76893b49c68" (UID: "5fb2cb6f-9055-428a-8ae8-d76893b49c68"). InnerVolumeSpecName "kube-api-access-hcv5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.094837 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5fb2cb6f-9055-428a-8ae8-d76893b49c68" (UID: "5fb2cb6f-9055-428a-8ae8-d76893b49c68"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.096357 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-config" (OuterVolumeSpecName: "config") pod "5fb2cb6f-9055-428a-8ae8-d76893b49c68" (UID: "5fb2cb6f-9055-428a-8ae8-d76893b49c68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.107514 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5fb2cb6f-9055-428a-8ae8-d76893b49c68" (UID: "5fb2cb6f-9055-428a-8ae8-d76893b49c68"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.117140 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5fb2cb6f-9055-428a-8ae8-d76893b49c68" (UID: "5fb2cb6f-9055-428a-8ae8-d76893b49c68"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.118538 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcv5r\" (UniqueName: \"kubernetes.io/projected/5fb2cb6f-9055-428a-8ae8-d76893b49c68-kube-api-access-hcv5r\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.118567 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.118579 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-config\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.118589 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.118597 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.121657 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5fb2cb6f-9055-428a-8ae8-d76893b49c68" (UID: "5fb2cb6f-9055-428a-8ae8-d76893b49c68"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.219919 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fb2cb6f-9055-428a-8ae8-d76893b49c68-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.380431 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" event={"ID":"5fb2cb6f-9055-428a-8ae8-d76893b49c68","Type":"ContainerDied","Data":"9c94221b7fe986ec5917cd3740ced375bca67812ff64592d29a51cda4e7b1769"} Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.380493 4892 scope.go:117] "RemoveContainer" containerID="219940f2babea1f478053869a6de00830abbd54bbb0a262cf1cfbdd307af8bba" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.380687 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-7rjq9" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.395384 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77d47b9fd6-4gbj5"] Feb 17 18:03:54 crc kubenswrapper[4892]: E0217 18:03:54.396004 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cb6f-9055-428a-8ae8-d76893b49c68" containerName="init" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.396020 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cb6f-9055-428a-8ae8-d76893b49c68" containerName="init" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.396211 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cb6f-9055-428a-8ae8-d76893b49c68" containerName="init" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.397271 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.401003 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.401219 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.419356 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77d47b9fd6-4gbj5"] Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.489811 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-7rjq9"] Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.499394 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-7rjq9"] Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525131 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tcgkv\" (UniqueName: \"kubernetes.io/projected/8947a6cb-f018-4042-a2f8-e17591b0394d-kube-api-access-tcgkv\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525219 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data-custom\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525240 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525267 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-combined-ca-bundle\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525360 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8947a6cb-f018-4042-a2f8-e17591b0394d-logs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525432 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-public-tls-certs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.525528 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-internal-tls-certs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.626840 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-internal-tls-certs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.626892 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgkv\" (UniqueName: \"kubernetes.io/projected/8947a6cb-f018-4042-a2f8-e17591b0394d-kube-api-access-tcgkv\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.626936 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data-custom\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.626952 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.626972 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-combined-ca-bundle\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.627026 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8947a6cb-f018-4042-a2f8-e17591b0394d-logs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.627076 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-public-tls-certs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.628734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8947a6cb-f018-4042-a2f8-e17591b0394d-logs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.634510 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-public-tls-certs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.635655 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-internal-tls-certs\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.636144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-combined-ca-bundle\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.648355 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data-custom\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.650542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.651056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgkv\" (UniqueName: 
\"kubernetes.io/projected/8947a6cb-f018-4042-a2f8-e17591b0394d-kube-api-access-tcgkv\") pod \"barbican-api-77d47b9fd6-4gbj5\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:54 crc kubenswrapper[4892]: I0217 18:03:54.718672 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.347686 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77d47b9fd6-4gbj5"] Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.373971 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fb2cb6f-9055-428a-8ae8-d76893b49c68" path="/var/lib/kubelet/pods/5fb2cb6f-9055-428a-8ae8-d76893b49c68/volumes" Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.417486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d47b9fd6-4gbj5" event={"ID":"8947a6cb-f018-4042-a2f8-e17591b0394d","Type":"ContainerStarted","Data":"01106c64d635a5068367d9b45b8a6183bae967cc967b8dbdb618840c91c8bd1c"} Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.425495 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" event={"ID":"2eef18f1-0710-4c8f-af0b-d8c836eef0f1","Type":"ContainerStarted","Data":"3c01b90efe5b5d4250a767cec037c711397f694acc495d33f9f97c0d90167be9"} Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.425609 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.431199 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" event={"ID":"7a97132b-c4ac-4645-8844-1dc5acf466a1","Type":"ContainerStarted","Data":"2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c"} Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 
18:03:55.431235 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" event={"ID":"7a97132b-c4ac-4645-8844-1dc5acf466a1","Type":"ContainerStarted","Data":"2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760"} Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.435916 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" event={"ID":"bf3f5fcf-07b1-4f42-a5be-5b11052d080a","Type":"ContainerStarted","Data":"70e153e496f3b77144597bb37e7faa188dca474fe0e9e263bb0ae95405212f33"} Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.449384 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" podStartSLOduration=5.449364662 podStartE2EDuration="5.449364662s" podCreationTimestamp="2026-02-17 18:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:03:55.441366198 +0000 UTC m=+1206.816769463" watchObservedRunningTime="2026-02-17 18:03:55.449364662 +0000 UTC m=+1206.824767927" Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.464644 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" podStartSLOduration=2.2640095860000002 podStartE2EDuration="5.46462618s" podCreationTimestamp="2026-02-17 18:03:50 +0000 UTC" firstStartedPulling="2026-02-17 18:03:51.337787917 +0000 UTC m=+1202.713191182" lastFinishedPulling="2026-02-17 18:03:54.538404521 +0000 UTC m=+1205.913807776" observedRunningTime="2026-02-17 18:03:55.458720002 +0000 UTC m=+1206.834123267" watchObservedRunningTime="2026-02-17 18:03:55.46462618 +0000 UTC m=+1206.840029445" Feb 17 18:03:55 crc kubenswrapper[4892]: I0217 18:03:55.479507 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" podStartSLOduration=2.49422357 podStartE2EDuration="5.479491179s" podCreationTimestamp="2026-02-17 18:03:50 +0000 UTC" firstStartedPulling="2026-02-17 18:03:51.551758645 +0000 UTC m=+1202.927161910" lastFinishedPulling="2026-02-17 18:03:54.537026264 +0000 UTC m=+1205.912429519" observedRunningTime="2026-02-17 18:03:55.476870969 +0000 UTC m=+1206.852274234" watchObservedRunningTime="2026-02-17 18:03:55.479491179 +0000 UTC m=+1206.854894444" Feb 17 18:03:56 crc kubenswrapper[4892]: I0217 18:03:56.462754 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d47b9fd6-4gbj5" event={"ID":"8947a6cb-f018-4042-a2f8-e17591b0394d","Type":"ContainerStarted","Data":"b921cc6366edc36ee38b6a143bfc88dcd8710d62a31bab58b5973626b24cb7ca"} Feb 17 18:03:56 crc kubenswrapper[4892]: I0217 18:03:56.475976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" event={"ID":"bf3f5fcf-07b1-4f42-a5be-5b11052d080a","Type":"ContainerStarted","Data":"c7cb681f90d6976e6b5cbf00e7ea59a76be308b0fe3edc7745833b928200cf4d"} Feb 17 18:04:00 crc kubenswrapper[4892]: I0217 18:04:00.524412 4892 generic.go:334] "Generic (PLEG): container finished" podID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" containerID="13f881e4638fa1d108f0b4917c9d4bd23859a377ed76cbafeeead4d196d41cab" exitCode=0 Feb 17 18:04:00 crc kubenswrapper[4892]: I0217 18:04:00.524498 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7rrhb" event={"ID":"e85111e5-c198-4b36-bdce-5eb7a8ff75ea","Type":"ContainerDied","Data":"13f881e4638fa1d108f0b4917c9d4bd23859a377ed76cbafeeead4d196d41cab"} Feb 17 18:04:00 crc kubenswrapper[4892]: I0217 18:04:00.529580 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d47b9fd6-4gbj5" 
event={"ID":"8947a6cb-f018-4042-a2f8-e17591b0394d","Type":"ContainerStarted","Data":"80a0a6f3f3f9ebbe69c194f3e074f48a042c65dd9cb6a3e8b0adc00867e049f1"} Feb 17 18:04:00 crc kubenswrapper[4892]: I0217 18:04:00.529797 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:04:00 crc kubenswrapper[4892]: I0217 18:04:00.573634 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77d47b9fd6-4gbj5" podStartSLOduration=6.573616861 podStartE2EDuration="6.573616861s" podCreationTimestamp="2026-02-17 18:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:00.557638364 +0000 UTC m=+1211.933041629" watchObservedRunningTime="2026-02-17 18:04:00.573616861 +0000 UTC m=+1211.949020126" Feb 17 18:04:00 crc kubenswrapper[4892]: I0217 18:04:00.970960 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.080316 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-dw87w"] Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.080877 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerName="dnsmasq-dns" containerID="cri-o://e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1" gracePeriod=10 Feb 17 18:04:01 crc kubenswrapper[4892]: E0217 18:04:01.351265 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ff949e_3323_4573_ac6a_4b1714b1976c.slice/crio-e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1.scope\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ff949e_3323_4573_ac6a_4b1714b1976c.slice/crio-conmon-e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.575131 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerStarted","Data":"4770ccc77abc3fc5a4d307d9af15ea5f5bbbf85c76fd3467db07e2c10f0044ad"} Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.575511 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-central-agent" containerID="cri-o://1ca249ccf30d54590b7bedfc5fd4d93955d7342105c6c4b76747c5f1f064b57b" gracePeriod=30 Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.575745 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.576009 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="proxy-httpd" containerID="cri-o://4770ccc77abc3fc5a4d307d9af15ea5f5bbbf85c76fd3467db07e2c10f0044ad" gracePeriod=30 Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.576065 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="sg-core" containerID="cri-o://1b519ddd92ccb43096fe441e4b92da614f8084679a5f6b39b706dd73a6031e61" gracePeriod=30 Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.576096 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" 
containerName="ceilometer-notification-agent" containerID="cri-o://339f18965d225092aa6b74cb44351cec1a8bae37411ff5d9763012698e3bd335" gracePeriod=30 Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.593090 4892 generic.go:334] "Generic (PLEG): container finished" podID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerID="e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1" exitCode=0 Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.593214 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" event={"ID":"04ff949e-3323-4573-ac6a-4b1714b1976c","Type":"ContainerDied","Data":"e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1"} Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.593483 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.801437 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.839093 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7265916949999998 podStartE2EDuration="53.839072181s" podCreationTimestamp="2026-02-17 18:03:08 +0000 UTC" firstStartedPulling="2026-02-17 18:03:09.781588312 +0000 UTC m=+1161.156991567" lastFinishedPulling="2026-02-17 18:04:00.894068768 +0000 UTC m=+1212.269472053" observedRunningTime="2026-02-17 18:04:01.603635519 +0000 UTC m=+1212.979038784" watchObservedRunningTime="2026-02-17 18:04:01.839072181 +0000 UTC m=+1213.214475446" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.951501 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-svc\") pod \"04ff949e-3323-4573-ac6a-4b1714b1976c\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.951561 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-sb\") pod \"04ff949e-3323-4573-ac6a-4b1714b1976c\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.951591 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-nb\") pod \"04ff949e-3323-4573-ac6a-4b1714b1976c\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.951643 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-config\") pod 
\"04ff949e-3323-4573-ac6a-4b1714b1976c\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.951691 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-swift-storage-0\") pod \"04ff949e-3323-4573-ac6a-4b1714b1976c\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.951753 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf56q\" (UniqueName: \"kubernetes.io/projected/04ff949e-3323-4573-ac6a-4b1714b1976c-kube-api-access-sf56q\") pod \"04ff949e-3323-4573-ac6a-4b1714b1976c\" (UID: \"04ff949e-3323-4573-ac6a-4b1714b1976c\") " Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.959404 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ff949e-3323-4573-ac6a-4b1714b1976c-kube-api-access-sf56q" (OuterVolumeSpecName: "kube-api-access-sf56q") pod "04ff949e-3323-4573-ac6a-4b1714b1976c" (UID: "04ff949e-3323-4573-ac6a-4b1714b1976c"). InnerVolumeSpecName "kube-api-access-sf56q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:01 crc kubenswrapper[4892]: I0217 18:04:01.980350 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.024577 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04ff949e-3323-4573-ac6a-4b1714b1976c" (UID: "04ff949e-3323-4573-ac6a-4b1714b1976c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.054527 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.054556 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf56q\" (UniqueName: \"kubernetes.io/projected/04ff949e-3323-4573-ac6a-4b1714b1976c-kube-api-access-sf56q\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.055062 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04ff949e-3323-4573-ac6a-4b1714b1976c" (UID: "04ff949e-3323-4573-ac6a-4b1714b1976c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.068449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04ff949e-3323-4573-ac6a-4b1714b1976c" (UID: "04ff949e-3323-4573-ac6a-4b1714b1976c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.096018 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04ff949e-3323-4573-ac6a-4b1714b1976c" (UID: "04ff949e-3323-4573-ac6a-4b1714b1976c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.096266 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-config" (OuterVolumeSpecName: "config") pod "04ff949e-3323-4573-ac6a-4b1714b1976c" (UID: "04ff949e-3323-4573-ac6a-4b1714b1976c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.155623 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjgkj\" (UniqueName: \"kubernetes.io/projected/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-kube-api-access-gjgkj\") pod \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.155746 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-db-sync-config-data\") pod \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.155882 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-config-data\") pod \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.155924 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-etc-machine-id\") pod \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.155949 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-scripts\") pod \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.156049 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-combined-ca-bundle\") pod \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\" (UID: \"e85111e5-c198-4b36-bdce-5eb7a8ff75ea\") " Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.156484 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.156501 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.156510 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.156519 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04ff949e-3323-4573-ac6a-4b1714b1976c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.157758 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e85111e5-c198-4b36-bdce-5eb7a8ff75ea" (UID: 
"e85111e5-c198-4b36-bdce-5eb7a8ff75ea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.161225 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e85111e5-c198-4b36-bdce-5eb7a8ff75ea" (UID: "e85111e5-c198-4b36-bdce-5eb7a8ff75ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.162069 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-kube-api-access-gjgkj" (OuterVolumeSpecName: "kube-api-access-gjgkj") pod "e85111e5-c198-4b36-bdce-5eb7a8ff75ea" (UID: "e85111e5-c198-4b36-bdce-5eb7a8ff75ea"). InnerVolumeSpecName "kube-api-access-gjgkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.163290 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-scripts" (OuterVolumeSpecName: "scripts") pod "e85111e5-c198-4b36-bdce-5eb7a8ff75ea" (UID: "e85111e5-c198-4b36-bdce-5eb7a8ff75ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.193921 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85111e5-c198-4b36-bdce-5eb7a8ff75ea" (UID: "e85111e5-c198-4b36-bdce-5eb7a8ff75ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.205922 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.235697 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-config-data" (OuterVolumeSpecName: "config-data") pod "e85111e5-c198-4b36-bdce-5eb7a8ff75ea" (UID: "e85111e5-c198-4b36-bdce-5eb7a8ff75ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.259636 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.259677 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.259689 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.259700 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.259713 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjgkj\" (UniqueName: \"kubernetes.io/projected/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-kube-api-access-gjgkj\") on node \"crc\" DevicePath \"\"" Feb 
17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.259725 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e85111e5-c198-4b36-bdce-5eb7a8ff75ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.678245 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" event={"ID":"04ff949e-3323-4573-ac6a-4b1714b1976c","Type":"ContainerDied","Data":"9305ec546ef009b56ab867fec42819b937adf5fa644ec35b85a2b59d513a8543"} Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.679538 4892 scope.go:117] "RemoveContainer" containerID="e0bceedcc0f8aa99d2da835f14efea6c50f15e3cf64b5ff954b6ccfa050e77e1" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.680101 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-dw87w" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.702851 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7rrhb" event={"ID":"e85111e5-c198-4b36-bdce-5eb7a8ff75ea","Type":"ContainerDied","Data":"8f1ea9bca6bfc9754522295f832885667c13e97c765003ac1ce99459674f9baa"} Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.702907 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f1ea9bca6bfc9754522295f832885667c13e97c765003ac1ce99459674f9baa" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.703004 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7rrhb" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.705829 4892 generic.go:334] "Generic (PLEG): container finished" podID="14ec5fef-7255-4519-b447-474ccccaebdb" containerID="4770ccc77abc3fc5a4d307d9af15ea5f5bbbf85c76fd3467db07e2c10f0044ad" exitCode=0 Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.705849 4892 generic.go:334] "Generic (PLEG): container finished" podID="14ec5fef-7255-4519-b447-474ccccaebdb" containerID="1b519ddd92ccb43096fe441e4b92da614f8084679a5f6b39b706dd73a6031e61" exitCode=2 Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.705856 4892 generic.go:334] "Generic (PLEG): container finished" podID="14ec5fef-7255-4519-b447-474ccccaebdb" containerID="1ca249ccf30d54590b7bedfc5fd4d93955d7342105c6c4b76747c5f1f064b57b" exitCode=0 Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.706971 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerDied","Data":"4770ccc77abc3fc5a4d307d9af15ea5f5bbbf85c76fd3467db07e2c10f0044ad"} Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.707029 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerDied","Data":"1b519ddd92ccb43096fe441e4b92da614f8084679a5f6b39b706dd73a6031e61"} Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.707039 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerDied","Data":"1ca249ccf30d54590b7bedfc5fd4d93955d7342105c6c4b76747c5f1f064b57b"} Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.730955 4892 scope.go:117] "RemoveContainer" containerID="b5103e207bc9f1bea8a4e92662e06c8699afde4b9d2651bb605a0f7ad28da5fc" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.749150 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-dw87w"] Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.761930 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-dw87w"] Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.898415 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:04:02 crc kubenswrapper[4892]: I0217 18:04:02.907658 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.060082 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:03 crc kubenswrapper[4892]: E0217 18:04:03.060605 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerName="init" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.060628 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerName="init" Feb 17 18:04:03 crc kubenswrapper[4892]: E0217 18:04:03.060669 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerName="dnsmasq-dns" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.060678 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerName="dnsmasq-dns" Feb 17 18:04:03 crc kubenswrapper[4892]: E0217 18:04:03.060690 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" containerName="cinder-db-sync" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.060700 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" containerName="cinder-db-sync" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.060969 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" containerName="dnsmasq-dns" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.060990 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" containerName="cinder-db-sync" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.062087 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.065159 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.065393 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6qpqs" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.065522 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.069288 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.081981 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.082029 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.082065 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.082607 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjzn\" (UniqueName: \"kubernetes.io/projected/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-kube-api-access-jtjzn\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.082914 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.082942 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.091426 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.156192 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-9mg8g"] Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.161503 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-config\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185086 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185116 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl5vh\" (UniqueName: \"kubernetes.io/projected/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-kube-api-access-nl5vh\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185137 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc 
kubenswrapper[4892]: I0217 18:04:03.185179 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185276 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185314 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjzn\" (UniqueName: \"kubernetes.io/projected/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-kube-api-access-jtjzn\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185475 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185590 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.185672 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.196223 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.213717 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.215298 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.220393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.220709 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjzn\" (UniqueName: \"kubernetes.io/projected/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-kube-api-access-jtjzn\") pod \"cinder-scheduler-0\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.250999 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-9mg8g"] Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.290301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.290359 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.290429 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.290493 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.290592 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-config\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.290677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl5vh\" (UniqueName: \"kubernetes.io/projected/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-kube-api-access-nl5vh\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.303238 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.303250 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.303730 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-config\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.304257 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.305263 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.326531 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.328172 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.332259 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.333756 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl5vh\" (UniqueName: \"kubernetes.io/projected/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-kube-api-access-nl5vh\") pod \"dnsmasq-dns-6bb4fc677f-9mg8g\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.391713 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.392296 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data-custom\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.392338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.392364 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-scripts\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.392436 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6k7\" (UniqueName: \"kubernetes.io/projected/493f91a6-7322-4fb8-8853-5021cf556a97-kube-api-access-xw6k7\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.392464 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.392480 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/493f91a6-7322-4fb8-8853-5021cf556a97-etc-machine-id\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.393095 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493f91a6-7322-4fb8-8853-5021cf556a97-logs\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.399978 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ff949e-3323-4573-ac6a-4b1714b1976c" path="/var/lib/kubelet/pods/04ff949e-3323-4573-ac6a-4b1714b1976c/volumes" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.400758 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.497621 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data-custom\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.497692 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.497732 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-scripts\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.497808 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6k7\" (UniqueName: \"kubernetes.io/projected/493f91a6-7322-4fb8-8853-5021cf556a97-kube-api-access-xw6k7\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.497860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.497880 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/493f91a6-7322-4fb8-8853-5021cf556a97-etc-machine-id\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc 
kubenswrapper[4892]: I0217 18:04:03.497965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493f91a6-7322-4fb8-8853-5021cf556a97-logs\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.498472 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493f91a6-7322-4fb8-8853-5021cf556a97-logs\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.503169 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data-custom\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.507193 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.507306 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/493f91a6-7322-4fb8-8853-5021cf556a97-etc-machine-id\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.511318 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-scripts\") pod \"cinder-api-0\" (UID: 
\"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.525910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.530539 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6k7\" (UniqueName: \"kubernetes.io/projected/493f91a6-7322-4fb8-8853-5021cf556a97-kube-api-access-xw6k7\") pod \"cinder-api-0\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") " pod="openstack/cinder-api-0" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.636423 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:03 crc kubenswrapper[4892]: I0217 18:04:03.697856 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.072795 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:04 crc kubenswrapper[4892]: W0217 18:04:04.075343 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa545b2_4053_4a5c_8a22_d0bbb112db1a.slice/crio-9ce7211d2030d3b1cbfd4db923ae093cacab96c51f6d85cec5cfe3bf6e42897b WatchSource:0}: Error finding container 9ce7211d2030d3b1cbfd4db923ae093cacab96c51f6d85cec5cfe3bf6e42897b: Status 404 returned error can't find the container with id 9ce7211d2030d3b1cbfd4db923ae093cacab96c51f6d85cec5cfe3bf6e42897b Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.295418 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-9mg8g"] Feb 17 18:04:04 crc kubenswrapper[4892]: W0217 18:04:04.308944 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod522cb1f9_2e67_46ca_b4e1_3c837d27d9ac.slice/crio-fc68767794ac670f2f8a3577021ff4aa4f814cd12103b0261138ed3bb016b90c WatchSource:0}: Error finding container fc68767794ac670f2f8a3577021ff4aa4f814cd12103b0261138ed3bb016b90c: Status 404 returned error can't find the container with id fc68767794ac670f2f8a3577021ff4aa4f814cd12103b0261138ed3bb016b90c Feb 17 18:04:04 crc kubenswrapper[4892]: W0217 18:04:04.317267 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493f91a6_7322_4fb8_8853_5021cf556a97.slice/crio-fbd79122ec9b78ff09c6bf6e32d00818bd501235f2b31d34ac8c72510d55c25d WatchSource:0}: Error finding container fbd79122ec9b78ff09c6bf6e32d00818bd501235f2b31d34ac8c72510d55c25d: Status 404 returned error can't find the container with id fbd79122ec9b78ff09c6bf6e32d00818bd501235f2b31d34ac8c72510d55c25d Feb 
17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.324160 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.751511 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"493f91a6-7322-4fb8-8853-5021cf556a97","Type":"ContainerStarted","Data":"fbd79122ec9b78ff09c6bf6e32d00818bd501235f2b31d34ac8c72510d55c25d"} Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.756772 4892 generic.go:334] "Generic (PLEG): container finished" podID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerID="5114d3b45235512ff14c5852bae41a22117edf23a6c4bc5ea936e4ced47a583f" exitCode=0 Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.757097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" event={"ID":"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac","Type":"ContainerDied","Data":"5114d3b45235512ff14c5852bae41a22117edf23a6c4bc5ea936e4ced47a583f"} Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.757133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" event={"ID":"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac","Type":"ContainerStarted","Data":"fc68767794ac670f2f8a3577021ff4aa4f814cd12103b0261138ed3bb016b90c"} Feb 17 18:04:04 crc kubenswrapper[4892]: I0217 18:04:04.759627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fa545b2-4053-4a5c-8a22-d0bbb112db1a","Type":"ContainerStarted","Data":"9ce7211d2030d3b1cbfd4db923ae093cacab96c51f6d85cec5cfe3bf6e42897b"} Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.785142 4892 generic.go:334] "Generic (PLEG): container finished" podID="14ec5fef-7255-4519-b447-474ccccaebdb" containerID="339f18965d225092aa6b74cb44351cec1a8bae37411ff5d9763012698e3bd335" exitCode=0 Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.785212 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerDied","Data":"339f18965d225092aa6b74cb44351cec1a8bae37411ff5d9763012698e3bd335"} Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.804439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"493f91a6-7322-4fb8-8853-5021cf556a97","Type":"ContainerStarted","Data":"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"} Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.809329 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" event={"ID":"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac","Type":"ContainerStarted","Data":"cd9ecb1657eaf84acbc9ad761ad42a0e4fed1fa8c98f3b1aa627599fe7925dc0"} Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.809677 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.814638 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fa545b2-4053-4a5c-8a22-d0bbb112db1a","Type":"ContainerStarted","Data":"54673b8a6b7d7e20fc21f82b3621ecf169e0813410a5de37f605ecc06b56930b"} Feb 17 18:04:05 crc kubenswrapper[4892]: I0217 18:04:05.837627 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" podStartSLOduration=2.83761379 podStartE2EDuration="2.83761379s" podCreationTimestamp="2026-02-17 18:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:05.83610769 +0000 UTC m=+1217.211510955" watchObservedRunningTime="2026-02-17 18:04:05.83761379 +0000 UTC m=+1217.213017055" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.186050 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279084 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-run-httpd\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279125 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-scripts\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279167 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-log-httpd\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279213 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-sg-core-conf-yaml\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5776\" (UniqueName: \"kubernetes.io/projected/14ec5fef-7255-4519-b447-474ccccaebdb-kube-api-access-f5776\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279454 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-combined-ca-bundle\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279494 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-config-data\") pod \"14ec5fef-7255-4519-b447-474ccccaebdb\" (UID: \"14ec5fef-7255-4519-b447-474ccccaebdb\") " Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279548 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.279928 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.280120 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.321281 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ec5fef-7255-4519-b447-474ccccaebdb-kube-api-access-f5776" (OuterVolumeSpecName: "kube-api-access-f5776") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). 
InnerVolumeSpecName "kube-api-access-f5776". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.322976 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-scripts" (OuterVolumeSpecName: "scripts") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.354965 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.384047 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.384083 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec5fef-7255-4519-b447-474ccccaebdb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.384095 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.384105 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5776\" (UniqueName: \"kubernetes.io/projected/14ec5fef-7255-4519-b447-474ccccaebdb-kube-api-access-f5776\") on node 
\"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.402932 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.438315 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.486500 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.495951 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-config-data" (OuterVolumeSpecName: "config-data") pod "14ec5fef-7255-4519-b447-474ccccaebdb" (UID: "14ec5fef-7255-4519-b447-474ccccaebdb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.588427 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec5fef-7255-4519-b447-474ccccaebdb-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.838794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fa545b2-4053-4a5c-8a22-d0bbb112db1a","Type":"ContainerStarted","Data":"f7e77513d0ccfd80c5434dae1f79fa0da0f8bb41f7ff365bcf112a36604a6274"} Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.844222 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec5fef-7255-4519-b447-474ccccaebdb","Type":"ContainerDied","Data":"65b7540608340a0fd35d712d4378d8841621d0b688dd6c7aa3f7c858d2d0b6e3"} Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.844275 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.844267 4892 scope.go:117] "RemoveContainer" containerID="4770ccc77abc3fc5a4d307d9af15ea5f5bbbf85c76fd3467db07e2c10f0044ad" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.847448 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"493f91a6-7322-4fb8-8853-5021cf556a97","Type":"ContainerStarted","Data":"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"} Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.847482 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.860169 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.138396911 podStartE2EDuration="3.860153218s" podCreationTimestamp="2026-02-17 18:04:03 +0000 UTC" firstStartedPulling="2026-02-17 18:04:04.07742741 +0000 UTC m=+1215.452830675" lastFinishedPulling="2026-02-17 18:04:04.799183717 +0000 UTC m=+1216.174586982" observedRunningTime="2026-02-17 18:04:06.855933876 +0000 UTC m=+1218.231337151" watchObservedRunningTime="2026-02-17 18:04:06.860153218 +0000 UTC m=+1218.235556483" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.922326 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.922303381 podStartE2EDuration="3.922303381s" podCreationTimestamp="2026-02-17 18:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:06.894486018 +0000 UTC m=+1218.269889303" watchObservedRunningTime="2026-02-17 18:04:06.922303381 +0000 UTC m=+1218.297706646" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.944233 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.954149 4892 scope.go:117] "RemoveContainer" containerID="1b519ddd92ccb43096fe441e4b92da614f8084679a5f6b39b706dd73a6031e61" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.955735 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.963780 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:06 crc kubenswrapper[4892]: E0217 18:04:06.964297 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="proxy-httpd" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964316 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="proxy-httpd" Feb 17 18:04:06 crc kubenswrapper[4892]: E0217 18:04:06.964335 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-notification-agent" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964342 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-notification-agent" Feb 17 18:04:06 crc kubenswrapper[4892]: E0217 18:04:06.964354 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-central-agent" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964360 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-central-agent" Feb 17 18:04:06 crc kubenswrapper[4892]: E0217 18:04:06.964375 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="sg-core" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964380 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="sg-core" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964598 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="proxy-httpd" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964623 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-notification-agent" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964640 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="ceilometer-central-agent" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.964651 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" containerName="sg-core" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.973419 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.977272 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.986114 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:04:06 crc kubenswrapper[4892]: I0217 18:04:06.986252 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.031737 4892 scope.go:117] "RemoveContainer" containerID="339f18965d225092aa6b74cb44351cec1a8bae37411ff5d9763012698e3bd335" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.086944 4892 scope.go:117] "RemoveContainer" containerID="1ca249ccf30d54590b7bedfc5fd4d93955d7342105c6c4b76747c5f1f064b57b" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.100250 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-run-httpd\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.100673 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v26q9\" (UniqueName: \"kubernetes.io/projected/a1035d59-31a0-44b9-b196-4b66c6cd3b19-kube-api-access-v26q9\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.100775 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " 
pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.101306 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-config-data\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.101423 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.101547 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-scripts\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.101681 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-log-httpd\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205435 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-run-httpd\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205507 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v26q9\" (UniqueName: \"kubernetes.io/projected/a1035d59-31a0-44b9-b196-4b66c6cd3b19-kube-api-access-v26q9\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205537 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205577 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-config-data\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205605 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205654 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-scripts\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.205713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-log-httpd\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0" Feb 17 18:04:07 
crc kubenswrapper[4892]: I0217 18:04:07.205985 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-run-httpd\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.206218 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-log-httpd\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.210982 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.211445 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-config-data\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.212679 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-scripts\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.212898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.223875 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v26q9\" (UniqueName: \"kubernetes.io/projected/a1035d59-31a0-44b9-b196-4b66c6cd3b19-kube-api-access-v26q9\") pod \"ceilometer-0\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.364425 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.374630 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ec5fef-7255-4519-b447-474ccccaebdb" path="/var/lib/kubelet/pods/14ec5fef-7255-4519-b447-474ccccaebdb/volumes"
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.859725 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api-log" containerID="cri-o://5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b" gracePeriod=30
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.861048 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api" containerID="cri-o://c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8" gracePeriod=30
Feb 17 18:04:07 crc kubenswrapper[4892]: I0217 18:04:07.894969 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:04:07 crc kubenswrapper[4892]: W0217 18:04:07.897385 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1035d59_31a0_44b9_b196_4b66c6cd3b19.slice/crio-44e4c8dc89b86a90fc641d14b762a1a3bf10ff4f2a58c0eefbe81d9c57a66fbb WatchSource:0}: Error finding container 44e4c8dc89b86a90fc641d14b762a1a3bf10ff4f2a58c0eefbe81d9c57a66fbb: Status 404 returned error can't find the container with id 44e4c8dc89b86a90fc641d14b762a1a3bf10ff4f2a58c0eefbe81d9c57a66fbb
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.392350 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.486350 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.533182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/493f91a6-7322-4fb8-8853-5021cf556a97-etc-machine-id\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.533522 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.533648 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-scripts\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.533785 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data-custom\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.533927 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw6k7\" (UniqueName: \"kubernetes.io/projected/493f91a6-7322-4fb8-8853-5021cf556a97-kube-api-access-xw6k7\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.534112 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493f91a6-7322-4fb8-8853-5021cf556a97-logs\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.534325 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-combined-ca-bundle\") pod \"493f91a6-7322-4fb8-8853-5021cf556a97\" (UID: \"493f91a6-7322-4fb8-8853-5021cf556a97\") "
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.533351 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/493f91a6-7322-4fb8-8853-5021cf556a97-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.535255 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/493f91a6-7322-4fb8-8853-5021cf556a97-logs" (OuterVolumeSpecName: "logs") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.535577 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493f91a6-7322-4fb8-8853-5021cf556a97-logs\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.535675 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/493f91a6-7322-4fb8-8853-5021cf556a97-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.538583 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-scripts" (OuterVolumeSpecName: "scripts") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.538598 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.539113 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493f91a6-7322-4fb8-8853-5021cf556a97-kube-api-access-xw6k7" (OuterVolumeSpecName: "kube-api-access-xw6k7") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "kube-api-access-xw6k7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.603239 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.627886 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data" (OuterVolumeSpecName: "config-data") pod "493f91a6-7322-4fb8-8853-5021cf556a97" (UID: "493f91a6-7322-4fb8-8853-5021cf556a97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.637965 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.638003 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.638017 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.638029 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/493f91a6-7322-4fb8-8853-5021cf556a97-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.638042 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw6k7\" (UniqueName: \"kubernetes.io/projected/493f91a6-7322-4fb8-8853-5021cf556a97-kube-api-access-xw6k7\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.872228 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerStarted","Data":"6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f"}
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.872272 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerStarted","Data":"44e4c8dc89b86a90fc641d14b762a1a3bf10ff4f2a58c0eefbe81d9c57a66fbb"}
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.875003 4892 generic.go:334] "Generic (PLEG): container finished" podID="493f91a6-7322-4fb8-8853-5021cf556a97" containerID="c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8" exitCode=0
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.875028 4892 generic.go:334] "Generic (PLEG): container finished" podID="493f91a6-7322-4fb8-8853-5021cf556a97" containerID="5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b" exitCode=143
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.875196 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.875980 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"493f91a6-7322-4fb8-8853-5021cf556a97","Type":"ContainerDied","Data":"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"}
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.876021 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"493f91a6-7322-4fb8-8853-5021cf556a97","Type":"ContainerDied","Data":"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"}
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.876039 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"493f91a6-7322-4fb8-8853-5021cf556a97","Type":"ContainerDied","Data":"fbd79122ec9b78ff09c6bf6e32d00818bd501235f2b31d34ac8c72510d55c25d"}
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.876057 4892 scope.go:117] "RemoveContainer" containerID="c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.899733 4892 scope.go:117] "RemoveContainer" containerID="5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.922525 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.926153 4892 scope.go:117] "RemoveContainer" containerID="c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"
Feb 17 18:04:08 crc kubenswrapper[4892]: E0217 18:04:08.926687 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8\": container with ID starting with c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8 not found: ID does not exist" containerID="c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.926722 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"} err="failed to get container status \"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8\": rpc error: code = NotFound desc = could not find container \"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8\": container with ID starting with c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8 not found: ID does not exist"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.926741 4892 scope.go:117] "RemoveContainer" containerID="5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"
Feb 17 18:04:08 crc kubenswrapper[4892]: E0217 18:04:08.927131 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b\": container with ID starting with 5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b not found: ID does not exist" containerID="5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.927150 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"} err="failed to get container status \"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b\": rpc error: code = NotFound desc = could not find container \"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b\": container with ID starting with 5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b not found: ID does not exist"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.927164 4892 scope.go:117] "RemoveContainer" containerID="c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.927428 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8"} err="failed to get container status \"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8\": rpc error: code = NotFound desc = could not find container \"c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8\": container with ID starting with c9ab81ed4c25c6f84233378ea5a2a6f1b53b365e3b4f03545c7723dd00700eb8 not found: ID does not exist"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.927442 4892 scope.go:117] "RemoveContainer" containerID="5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.929576 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b"} err="failed to get container status \"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b\": rpc error: code = NotFound desc = could not find container \"5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b\": container with ID starting with 5a8b73edc8f4cd57bae3bf775db55b14c68f64faaabe612f5bd20795b847eb9b not found: ID does not exist"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.934647 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.948335 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 17 18:04:08 crc kubenswrapper[4892]: E0217 18:04:08.948896 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api-log"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.948915 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api-log"
Feb 17 18:04:08 crc kubenswrapper[4892]: E0217 18:04:08.948961 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.948970 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.949251 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.949289 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" containerName="cinder-api-log"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.950697 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.953493 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.954212 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.954362 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 17 18:04:08 crc kubenswrapper[4892]: I0217 18:04:08.974932 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.045573 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.045738 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.045803 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.045888 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.046193 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-logs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.046271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjbvq\" (UniqueName: \"kubernetes.io/projected/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-kube-api-access-qjbvq\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.046648 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.046699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.046786 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-scripts\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148725 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148780 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-scripts\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148854 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148882 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148897 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148925 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.148998 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-logs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.149021 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjbvq\" (UniqueName: \"kubernetes.io/projected/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-kube-api-access-qjbvq\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.149071 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.149391 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-logs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.149019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.152119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.154368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-scripts\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.154627 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.155576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.155959 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.157364 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.168416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjbvq\" (UniqueName: \"kubernetes.io/projected/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-kube-api-access-qjbvq\") pod \"cinder-api-0\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.294841 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.388024 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493f91a6-7322-4fb8-8853-5021cf556a97" path="/var/lib/kubelet/pods/493f91a6-7322-4fb8-8853-5021cf556a97/volumes"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.726577 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77d47b9fd6-4gbj5"
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.765590 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 18:04:09 crc kubenswrapper[4892]: W0217 18:04:09.772980 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcbe58d5_c580_4a8c_8476_dd29bf7ca91b.slice/crio-22fa72325d60d7548606add73f4ef937e113f09c885e10d204399bd2f2063809 WatchSource:0}: Error finding container 22fa72325d60d7548606add73f4ef937e113f09c885e10d204399bd2f2063809: Status 404 returned error can't find the container with id 22fa72325d60d7548606add73f4ef937e113f09c885e10d204399bd2f2063809
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.807710 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5dc464d6d6-dj24w"]
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.808197 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5dc464d6d6-dj24w" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api-log" containerID="cri-o://7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d" gracePeriod=30
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.808849 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5dc464d6d6-dj24w" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api" containerID="cri-o://5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc" gracePeriod=30
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.906195 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b","Type":"ContainerStarted","Data":"22fa72325d60d7548606add73f4ef937e113f09c885e10d204399bd2f2063809"}
Feb 17 18:04:09 crc kubenswrapper[4892]: I0217 18:04:09.912832 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerStarted","Data":"9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87"}
Feb 17 18:04:10 crc kubenswrapper[4892]: I0217 18:04:10.941408 4892 generic.go:334] "Generic (PLEG): container finished" podID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerID="7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d" exitCode=143
Feb 17 18:04:10 crc kubenswrapper[4892]: I0217 18:04:10.941492 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc464d6d6-dj24w" event={"ID":"f44e1ba4-1235-4601-94da-dd3400cdb7cc","Type":"ContainerDied","Data":"7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d"}
Feb 17 18:04:10 crc kubenswrapper[4892]: I0217 18:04:10.944239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b","Type":"ContainerStarted","Data":"0b8f5fa1d04ca87abf7de487fc63b1102d6d2d69e2adcaa02f32fd33d6c4c382"}
Feb 17 18:04:10 crc kubenswrapper[4892]: I0217 18:04:10.947950 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerStarted","Data":"b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b"}
Feb 17 18:04:11 crc kubenswrapper[4892]: I0217 18:04:11.970506 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerStarted","Data":"04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb"}
Feb 17 18:04:11 crc kubenswrapper[4892]: I0217 18:04:11.971142 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 18:04:11 crc kubenswrapper[4892]: I0217 18:04:11.974831 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b","Type":"ContainerStarted","Data":"890f0489886784d4621fef4344f5982943816d37e5cb528aa0fe4904d5ef40c8"}
Feb 17 18:04:11 crc kubenswrapper[4892]: I0217 18:04:11.974987 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 18:04:12 crc kubenswrapper[4892]: I0217 18:04:12.007063 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6992250909999997 podStartE2EDuration="6.007041804s" podCreationTimestamp="2026-02-17 18:04:06 +0000 UTC" firstStartedPulling="2026-02-17 18:04:07.899176497 +0000 UTC m=+1219.274579762" lastFinishedPulling="2026-02-17 18:04:11.20699317 +0000 UTC m=+1222.582396475" observedRunningTime="2026-02-17 18:04:11.998972477 +0000 UTC m=+1223.374375782" watchObservedRunningTime="2026-02-17 18:04:12.007041804 +0000 UTC m=+1223.382445079"
Feb 17 18:04:12 crc kubenswrapper[4892]: I0217 18:04:12.965101 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5dc464d6d6-dj24w" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:52010->10.217.0.187:9311: read: connection reset by peer"
Feb 17 18:04:12 crc kubenswrapper[4892]: I0217 18:04:12.965134 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5dc464d6d6-dj24w" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:51996->10.217.0.187:9311: read: connection reset by peer"
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.485674 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dc464d6d6-dj24w"
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.524916 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.524892627 podStartE2EDuration="5.524892627s" podCreationTimestamp="2026-02-17 18:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:12.023465042 +0000 UTC m=+1223.398868327" watchObservedRunningTime="2026-02-17 18:04:13.524892627 +0000 UTC m=+1224.900295902"
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.637959 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g"
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.680598 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f44e1ba4-1235-4601-94da-dd3400cdb7cc-logs\") pod \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") "
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.680649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data-custom\") pod \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") "
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.680693 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q647\" (UniqueName: \"kubernetes.io/projected/f44e1ba4-1235-4601-94da-dd3400cdb7cc-kube-api-access-2q647\") pod \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") "
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.680717 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data\") pod \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") "
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.681096 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-combined-ca-bundle\") pod \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\" (UID: \"f44e1ba4-1235-4601-94da-dd3400cdb7cc\") "
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.681271 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.681480 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f44e1ba4-1235-4601-94da-dd3400cdb7cc-logs" (OuterVolumeSpecName: "logs") pod "f44e1ba4-1235-4601-94da-dd3400cdb7cc" (UID: "f44e1ba4-1235-4601-94da-dd3400cdb7cc").
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.687895 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f44e1ba4-1235-4601-94da-dd3400cdb7cc-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.702509 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44e1ba4-1235-4601-94da-dd3400cdb7cc-kube-api-access-2q647" (OuterVolumeSpecName: "kube-api-access-2q647") pod "f44e1ba4-1235-4601-94da-dd3400cdb7cc" (UID: "f44e1ba4-1235-4601-94da-dd3400cdb7cc"). InnerVolumeSpecName "kube-api-access-2q647". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.703002 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f44e1ba4-1235-4601-94da-dd3400cdb7cc" (UID: "f44e1ba4-1235-4601-94da-dd3400cdb7cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.751570 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f44e1ba4-1235-4601-94da-dd3400cdb7cc" (UID: "f44e1ba4-1235-4601-94da-dd3400cdb7cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.755910 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-grk2x"] Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.760800 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerName="dnsmasq-dns" containerID="cri-o://3c01b90efe5b5d4250a767cec037c711397f694acc495d33f9f97c0d90167be9" gracePeriod=10 Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.791559 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q647\" (UniqueName: \"kubernetes.io/projected/f44e1ba4-1235-4601-94da-dd3400cdb7cc-kube-api-access-2q647\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.791588 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.791602 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.808199 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.814079 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data" (OuterVolumeSpecName: "config-data") pod "f44e1ba4-1235-4601-94da-dd3400cdb7cc" (UID: "f44e1ba4-1235-4601-94da-dd3400cdb7cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.894442 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44e1ba4-1235-4601-94da-dd3400cdb7cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.996325 4892 generic.go:334] "Generic (PLEG): container finished" podID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerID="3c01b90efe5b5d4250a767cec037c711397f694acc495d33f9f97c0d90167be9" exitCode=0 Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.996458 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" event={"ID":"2eef18f1-0710-4c8f-af0b-d8c836eef0f1","Type":"ContainerDied","Data":"3c01b90efe5b5d4250a767cec037c711397f694acc495d33f9f97c0d90167be9"} Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.999092 4892 generic.go:334] "Generic (PLEG): container finished" podID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerID="5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc" exitCode=0 Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.999468 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="cinder-scheduler" containerID="cri-o://54673b8a6b7d7e20fc21f82b3621ecf169e0813410a5de37f605ecc06b56930b" gracePeriod=30 Feb 17 18:04:13 crc kubenswrapper[4892]: I0217 18:04:13.999756 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dc464d6d6-dj24w" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.003834 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc464d6d6-dj24w" event={"ID":"f44e1ba4-1235-4601-94da-dd3400cdb7cc","Type":"ContainerDied","Data":"5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc"} Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.003908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc464d6d6-dj24w" event={"ID":"f44e1ba4-1235-4601-94da-dd3400cdb7cc","Type":"ContainerDied","Data":"e1ea2ae9ce2461063ccf80b5a0eb1e90428ad72e610eabd3d25386541b131daa"} Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.003933 4892 scope.go:117] "RemoveContainer" containerID="5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.003946 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="probe" containerID="cri-o://f7e77513d0ccfd80c5434dae1f79fa0da0f8bb41f7ff365bcf112a36604a6274" gracePeriod=30 Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.042984 4892 scope.go:117] "RemoveContainer" containerID="7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.044149 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5dc464d6d6-dj24w"] Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.057168 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5dc464d6d6-dj24w"] Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.124557 4892 scope.go:117] "RemoveContainer" containerID="5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc" Feb 17 18:04:14 crc kubenswrapper[4892]: E0217 18:04:14.125322 4892 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc\": container with ID starting with 5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc not found: ID does not exist" containerID="5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.125393 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc"} err="failed to get container status \"5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc\": rpc error: code = NotFound desc = could not find container \"5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc\": container with ID starting with 5163886c0755a8c4e6089ce2614a7ccedad5e3dc84ee8fc992501c05fd1dbcbc not found: ID does not exist" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.125471 4892 scope.go:117] "RemoveContainer" containerID="7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d" Feb 17 18:04:14 crc kubenswrapper[4892]: E0217 18:04:14.125884 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d\": container with ID starting with 7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d not found: ID does not exist" containerID="7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.125927 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d"} err="failed to get container status \"7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d\": rpc error: code = NotFound desc = could 
not find container \"7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d\": container with ID starting with 7d9cd7df8486a1f344f32c6c05a5647c165e29020c4d0222576060f2d3af228d not found: ID does not exist" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.276204 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.407427 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-svc\") pod \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.408243 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-config\") pod \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.408370 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-sb\") pod \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.408442 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-swift-storage-0\") pod \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.408521 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncbxk\" (UniqueName: 
\"kubernetes.io/projected/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-kube-api-access-ncbxk\") pod \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.408565 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-nb\") pod \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\" (UID: \"2eef18f1-0710-4c8f-af0b-d8c836eef0f1\") " Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.414051 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-kube-api-access-ncbxk" (OuterVolumeSpecName: "kube-api-access-ncbxk") pod "2eef18f1-0710-4c8f-af0b-d8c836eef0f1" (UID: "2eef18f1-0710-4c8f-af0b-d8c836eef0f1"). InnerVolumeSpecName "kube-api-access-ncbxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.466866 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2eef18f1-0710-4c8f-af0b-d8c836eef0f1" (UID: "2eef18f1-0710-4c8f-af0b-d8c836eef0f1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.476533 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2eef18f1-0710-4c8f-af0b-d8c836eef0f1" (UID: "2eef18f1-0710-4c8f-af0b-d8c836eef0f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.482140 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-config" (OuterVolumeSpecName: "config") pod "2eef18f1-0710-4c8f-af0b-d8c836eef0f1" (UID: "2eef18f1-0710-4c8f-af0b-d8c836eef0f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.497776 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2eef18f1-0710-4c8f-af0b-d8c836eef0f1" (UID: "2eef18f1-0710-4c8f-af0b-d8c836eef0f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.502647 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2eef18f1-0710-4c8f-af0b-d8c836eef0f1" (UID: "2eef18f1-0710-4c8f-af0b-d8c836eef0f1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.511009 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.511052 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.511067 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.511080 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncbxk\" (UniqueName: \"kubernetes.io/projected/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-kube-api-access-ncbxk\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.511093 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:14 crc kubenswrapper[4892]: I0217 18:04:14.511107 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eef18f1-0710-4c8f-af0b-d8c836eef0f1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.021419 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.021469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-grk2x" event={"ID":"2eef18f1-0710-4c8f-af0b-d8c836eef0f1","Type":"ContainerDied","Data":"82b8fdad36380d14c867d6d39ef23507bb7369edd16394412093d87494b594b8"} Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.021868 4892 scope.go:117] "RemoveContainer" containerID="3c01b90efe5b5d4250a767cec037c711397f694acc495d33f9f97c0d90167be9" Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.034049 4892 generic.go:334] "Generic (PLEG): container finished" podID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerID="f7e77513d0ccfd80c5434dae1f79fa0da0f8bb41f7ff365bcf112a36604a6274" exitCode=0 Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.034133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fa545b2-4053-4a5c-8a22-d0bbb112db1a","Type":"ContainerDied","Data":"f7e77513d0ccfd80c5434dae1f79fa0da0f8bb41f7ff365bcf112a36604a6274"} Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.070079 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-grk2x"] Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.070246 4892 scope.go:117] "RemoveContainer" containerID="e530ac97433648e648bace58e5c8f15b83e507ba85496e9f58438ce16a3820ae" Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.088509 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-grk2x"] Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.377266 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" path="/var/lib/kubelet/pods/2eef18f1-0710-4c8f-af0b-d8c836eef0f1/volumes" Feb 17 18:04:15 crc kubenswrapper[4892]: I0217 18:04:15.378617 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" path="/var/lib/kubelet/pods/f44e1ba4-1235-4601-94da-dd3400cdb7cc/volumes" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.086683 4892 generic.go:334] "Generic (PLEG): container finished" podID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerID="54673b8a6b7d7e20fc21f82b3621ecf169e0813410a5de37f605ecc06b56930b" exitCode=0 Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.087464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fa545b2-4053-4a5c-8a22-d0bbb112db1a","Type":"ContainerDied","Data":"54673b8a6b7d7e20fc21f82b3621ecf169e0813410a5de37f605ecc06b56930b"} Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.322506 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500330 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data-custom\") pod \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500376 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data\") pod \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500443 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-etc-machine-id\") pod \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500499 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtjzn\" (UniqueName: \"kubernetes.io/projected/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-kube-api-access-jtjzn\") pod \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500536 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-scripts\") pod \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500588 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1fa545b2-4053-4a5c-8a22-d0bbb112db1a" (UID: "1fa545b2-4053-4a5c-8a22-d0bbb112db1a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.500718 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-combined-ca-bundle\") pod \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\" (UID: \"1fa545b2-4053-4a5c-8a22-d0bbb112db1a\") " Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.501895 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.506098 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1fa545b2-4053-4a5c-8a22-d0bbb112db1a" (UID: "1fa545b2-4053-4a5c-8a22-d0bbb112db1a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.506449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-kube-api-access-jtjzn" (OuterVolumeSpecName: "kube-api-access-jtjzn") pod "1fa545b2-4053-4a5c-8a22-d0bbb112db1a" (UID: "1fa545b2-4053-4a5c-8a22-d0bbb112db1a"). InnerVolumeSpecName "kube-api-access-jtjzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.507923 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-scripts" (OuterVolumeSpecName: "scripts") pod "1fa545b2-4053-4a5c-8a22-d0bbb112db1a" (UID: "1fa545b2-4053-4a5c-8a22-d0bbb112db1a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.555468 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fa545b2-4053-4a5c-8a22-d0bbb112db1a" (UID: "1fa545b2-4053-4a5c-8a22-d0bbb112db1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.603510 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.603551 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtjzn\" (UniqueName: \"kubernetes.io/projected/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-kube-api-access-jtjzn\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.603566 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.603579 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.661971 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data" (OuterVolumeSpecName: "config-data") pod "1fa545b2-4053-4a5c-8a22-d0bbb112db1a" (UID: "1fa545b2-4053-4a5c-8a22-d0bbb112db1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.705355 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa545b2-4053-4a5c-8a22-d0bbb112db1a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.761263 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6674d5469b-4kmh6" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.182:9696/\": dial tcp 10.217.0.182:9696: connect: connection refused" Feb 17 18:04:18 crc kubenswrapper[4892]: I0217 18:04:18.961253 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.104742 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fa545b2-4053-4a5c-8a22-d0bbb112db1a","Type":"ContainerDied","Data":"9ce7211d2030d3b1cbfd4db923ae093cacab96c51f6d85cec5cfe3bf6e42897b"} Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.105226 4892 scope.go:117] "RemoveContainer" containerID="f7e77513d0ccfd80c5434dae1f79fa0da0f8bb41f7ff365bcf112a36604a6274" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.104979 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.144048 4892 scope.go:117] "RemoveContainer" containerID="54673b8a6b7d7e20fc21f82b3621ecf169e0813410a5de37f605ecc06b56930b" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.157950 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.169385 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.198968 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:19 crc kubenswrapper[4892]: E0217 18:04:19.199493 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="probe" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199513 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="probe" Feb 17 18:04:19 crc kubenswrapper[4892]: E0217 18:04:19.199532 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerName="init" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199539 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerName="init" Feb 17 18:04:19 crc kubenswrapper[4892]: E0217 18:04:19.199561 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="cinder-scheduler" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199567 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="cinder-scheduler" Feb 17 18:04:19 crc kubenswrapper[4892]: E0217 18:04:19.199576 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199582 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api" Feb 17 18:04:19 crc kubenswrapper[4892]: E0217 18:04:19.199604 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerName="dnsmasq-dns" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199611 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerName="dnsmasq-dns" Feb 17 18:04:19 crc kubenswrapper[4892]: E0217 18:04:19.199626 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api-log" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199632 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api-log" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199865 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="cinder-scheduler" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199883 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api-log" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199908 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44e1ba4-1235-4601-94da-dd3400cdb7cc" containerName="barbican-api" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199921 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eef18f1-0710-4c8f-af0b-d8c836eef0f1" containerName="dnsmasq-dns" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.199936 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" containerName="probe" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.201088 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.207054 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.230974 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.322212 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6736a08-c35b-491c-b408-8a3dd641cd51-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.322263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gps8k\" (UniqueName: \"kubernetes.io/projected/a6736a08-c35b-491c-b408-8a3dd641cd51-kube-api-access-gps8k\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.322340 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.322376 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.322403 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.322421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.376603 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa545b2-4053-4a5c-8a22-d0bbb112db1a" path="/var/lib/kubelet/pods/1fa545b2-4053-4a5c-8a22-d0bbb112db1a/volumes" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.424773 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6736a08-c35b-491c-b408-8a3dd641cd51-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.424895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6736a08-c35b-491c-b408-8a3dd641cd51-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.425071 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gps8k\" (UniqueName: \"kubernetes.io/projected/a6736a08-c35b-491c-b408-8a3dd641cd51-kube-api-access-gps8k\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.425733 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.426570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.426690 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.426863 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.432175 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.444943 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.447488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.447975 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.449231 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gps8k\" (UniqueName: \"kubernetes.io/projected/a6736a08-c35b-491c-b408-8a3dd641cd51-kube-api-access-gps8k\") pod \"cinder-scheduler-0\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.527649 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.907062 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:04:19 crc kubenswrapper[4892]: I0217 18:04:19.911354 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-544cf5fc64-d8jkh" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.053934 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:04:20 crc kubenswrapper[4892]: W0217 18:04:20.073452 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6736a08_c35b_491c_b408_8a3dd641cd51.slice/crio-c9e98a9a097e082d7d42b313d5f99f2ff1b013b10ff34d724e7f60a147bda3a5 WatchSource:0}: Error finding container c9e98a9a097e082d7d42b313d5f99f2ff1b013b10ff34d724e7f60a147bda3a5: Status 404 returned error can't find the container with id c9e98a9a097e082d7d42b313d5f99f2ff1b013b10ff34d724e7f60a147bda3a5 Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.143441 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6736a08-c35b-491c-b408-8a3dd641cd51","Type":"ContainerStarted","Data":"c9e98a9a097e082d7d42b313d5f99f2ff1b013b10ff34d724e7f60a147bda3a5"} Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.154402 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db8547d8d-ftgvm"] Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.157379 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.188058 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db8547d8d-ftgvm"] Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.244925 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggf79\" (UniqueName: \"kubernetes.io/projected/5c6047a2-148b-46cf-a50b-b7147c7c9902-kube-api-access-ggf79\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.244994 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-config-data\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.245062 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c6047a2-148b-46cf-a50b-b7147c7c9902-logs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.245161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-scripts\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.245209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-internal-tls-certs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.245239 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-public-tls-certs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.245263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-combined-ca-bundle\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.348872 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggf79\" (UniqueName: \"kubernetes.io/projected/5c6047a2-148b-46cf-a50b-b7147c7c9902-kube-api-access-ggf79\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.349463 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-config-data\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.349586 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5c6047a2-148b-46cf-a50b-b7147c7c9902-logs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.349716 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-scripts\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.349833 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-internal-tls-certs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.349908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-public-tls-certs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.350044 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-combined-ca-bundle\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.352779 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c6047a2-148b-46cf-a50b-b7147c7c9902-logs\") pod \"placement-db8547d8d-ftgvm\" (UID: 
\"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.355168 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-combined-ca-bundle\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.355413 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-config-data\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.359729 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-scripts\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.361207 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-internal-tls-certs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.363396 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-public-tls-certs\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 
18:04:20.365869 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggf79\" (UniqueName: \"kubernetes.io/projected/5c6047a2-148b-46cf-a50b-b7147c7c9902-kube-api-access-ggf79\") pod \"placement-db8547d8d-ftgvm\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.562978 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:20 crc kubenswrapper[4892]: I0217 18:04:20.830531 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.061606 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db8547d8d-ftgvm"] Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.161523 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db8547d8d-ftgvm" event={"ID":"5c6047a2-148b-46cf-a50b-b7147c7c9902","Type":"ContainerStarted","Data":"4463f5c342aa43de4c462a518f9c057c9e62ab708a85e18143550201a1e1daff"} Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.165501 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6736a08-c35b-491c-b408-8a3dd641cd51","Type":"ContainerStarted","Data":"276fbe4a642e629846be447b31c24d7070dfa435158a65fb8bc262ffc1b036a1"} Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.563322 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.576366 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5758f86b57-ddm7q" Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.677035 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dbf87fcbd-5txkh"] Feb 17 18:04:21 crc 
kubenswrapper[4892]: I0217 18:04:21.677330 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dbf87fcbd-5txkh" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-api" containerID="cri-o://835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a" gracePeriod=30 Feb 17 18:04:21 crc kubenswrapper[4892]: I0217 18:04:21.677407 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dbf87fcbd-5txkh" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-httpd" containerID="cri-o://eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e" gracePeriod=30 Feb 17 18:04:21 crc kubenswrapper[4892]: E0217 18:04:21.979881 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad1c61c_5ce6_4b28_9034_5620d94bebc1.slice/crio-conmon-eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad1c61c_5ce6_4b28_9034_5620d94bebc1.slice/crio-eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.182024 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6736a08-c35b-491c-b408-8a3dd641cd51","Type":"ContainerStarted","Data":"72c2cbaf2de54480ce7abb484d8c16ea293b67cb726839d9a2f462baee040be3"} Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.184024 4892 generic.go:334] "Generic (PLEG): container finished" podID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerID="eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e" exitCode=0 Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.184071 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbf87fcbd-5txkh" event={"ID":"3ad1c61c-5ce6-4b28-9034-5620d94bebc1","Type":"ContainerDied","Data":"eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e"} Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.187119 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db8547d8d-ftgvm" event={"ID":"5c6047a2-148b-46cf-a50b-b7147c7c9902","Type":"ContainerStarted","Data":"f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2"} Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.187166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db8547d8d-ftgvm" event={"ID":"5c6047a2-148b-46cf-a50b-b7147c7c9902","Type":"ContainerStarted","Data":"8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3"} Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.188227 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.188257 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.244280 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.244260055 podStartE2EDuration="3.244260055s" podCreationTimestamp="2026-02-17 18:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:22.2050986 +0000 UTC m=+1233.580501885" watchObservedRunningTime="2026-02-17 18:04:22.244260055 +0000 UTC m=+1233.619663320" Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.246176 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db8547d8d-ftgvm" podStartSLOduration=2.246168027 podStartE2EDuration="2.246168027s" 
podCreationTimestamp="2026-02-17 18:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:22.221190974 +0000 UTC m=+1233.596594249" watchObservedRunningTime="2026-02-17 18:04:22.246168027 +0000 UTC m=+1233.621571282" Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.875488 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6674d5469b-4kmh6_8d1b1f1d-e838-4b89-8d8b-e61b88a9917e/neutron-api/0.log" Feb 17 18:04:22 crc kubenswrapper[4892]: I0217 18:04:22.876000 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.037165 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-httpd-config\") pod \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.037226 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-config\") pod \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.037259 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-combined-ca-bundle\") pod \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.037294 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-ovndb-tls-certs\") pod \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.037356 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbr9\" (UniqueName: \"kubernetes.io/projected/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-kube-api-access-fmbr9\") pod \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\" (UID: \"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e\") " Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.062106 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" (UID: "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.064614 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-kube-api-access-fmbr9" (OuterVolumeSpecName: "kube-api-access-fmbr9") pod "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" (UID: "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e"). InnerVolumeSpecName "kube-api-access-fmbr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.127919 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-config" (OuterVolumeSpecName: "config") pod "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" (UID: "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.137712 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" (UID: "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.140002 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.140034 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.140043 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.140052 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmbr9\" (UniqueName: \"kubernetes.io/projected/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-kube-api-access-fmbr9\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.150906 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" (UID: "8d1b1f1d-e838-4b89-8d8b-e61b88a9917e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.207377 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6674d5469b-4kmh6_8d1b1f1d-e838-4b89-8d8b-e61b88a9917e/neutron-api/0.log" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.207428 4892 generic.go:334] "Generic (PLEG): container finished" podID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerID="8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f" exitCode=137 Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.208431 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6674d5469b-4kmh6" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.208526 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674d5469b-4kmh6" event={"ID":"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e","Type":"ContainerDied","Data":"8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f"} Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.208561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674d5469b-4kmh6" event={"ID":"8d1b1f1d-e838-4b89-8d8b-e61b88a9917e","Type":"ContainerDied","Data":"e0bbd765e7ff74b0aa03e4531e76589d6c438ece00b59fdd894288c586332e69"} Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.208580 4892 scope.go:117] "RemoveContainer" containerID="afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.242149 4892 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.297050 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6674d5469b-4kmh6"] Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.307234 
4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6674d5469b-4kmh6"] Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.343956 4892 scope.go:117] "RemoveContainer" containerID="8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.377563 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" path="/var/lib/kubelet/pods/8d1b1f1d-e838-4b89-8d8b-e61b88a9917e/volumes" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.430352 4892 scope.go:117] "RemoveContainer" containerID="afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1" Feb 17 18:04:23 crc kubenswrapper[4892]: E0217 18:04:23.431794 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1\": container with ID starting with afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1 not found: ID does not exist" containerID="afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.431849 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1"} err="failed to get container status \"afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1\": rpc error: code = NotFound desc = could not find container \"afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1\": container with ID starting with afb250219a37fbd0d9f4383235bb1d104ff71038b3a1237587e4fc7a008ca4b1 not found: ID does not exist" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.431871 4892 scope.go:117] "RemoveContainer" containerID="8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f" Feb 17 18:04:23 crc kubenswrapper[4892]: E0217 
18:04:23.432835 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f\": container with ID starting with 8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f not found: ID does not exist" containerID="8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.432864 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f"} err="failed to get container status \"8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f\": rpc error: code = NotFound desc = could not find container \"8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f\": container with ID starting with 8621ee39ca6e415c44ee20c97506fb3890daed66ed91ff6b2d4bfcb4133e546f not found: ID does not exist" Feb 17 18:04:23 crc kubenswrapper[4892]: I0217 18:04:23.874410 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.058452 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvztw\" (UniqueName: \"kubernetes.io/projected/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-kube-api-access-rvztw\") pod \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.058874 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-httpd-config\") pod \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.058918 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-ovndb-tls-certs\") pod \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.059067 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-config\") pod \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.059108 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-combined-ca-bundle\") pod \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\" (UID: \"3ad1c61c-5ce6-4b28-9034-5620d94bebc1\") " Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.064017 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-kube-api-access-rvztw" (OuterVolumeSpecName: "kube-api-access-rvztw") pod "3ad1c61c-5ce6-4b28-9034-5620d94bebc1" (UID: "3ad1c61c-5ce6-4b28-9034-5620d94bebc1"). InnerVolumeSpecName "kube-api-access-rvztw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.064900 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3ad1c61c-5ce6-4b28-9034-5620d94bebc1" (UID: "3ad1c61c-5ce6-4b28-9034-5620d94bebc1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.127447 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-config" (OuterVolumeSpecName: "config") pod "3ad1c61c-5ce6-4b28-9034-5620d94bebc1" (UID: "3ad1c61c-5ce6-4b28-9034-5620d94bebc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.144206 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ad1c61c-5ce6-4b28-9034-5620d94bebc1" (UID: "3ad1c61c-5ce6-4b28-9034-5620d94bebc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.159541 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3ad1c61c-5ce6-4b28-9034-5620d94bebc1" (UID: "3ad1c61c-5ce6-4b28-9034-5620d94bebc1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.160795 4892 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.160865 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.160875 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.160886 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvztw\" (UniqueName: \"kubernetes.io/projected/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-kube-api-access-rvztw\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.160896 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ad1c61c-5ce6-4b28-9034-5620d94bebc1-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.237398 4892 generic.go:334] "Generic (PLEG): container finished" podID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerID="835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a" exitCode=0 Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.237444 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dbf87fcbd-5txkh" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.237485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbf87fcbd-5txkh" event={"ID":"3ad1c61c-5ce6-4b28-9034-5620d94bebc1","Type":"ContainerDied","Data":"835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a"} Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.237541 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbf87fcbd-5txkh" event={"ID":"3ad1c61c-5ce6-4b28-9034-5620d94bebc1","Type":"ContainerDied","Data":"590c9db451aa20074af5d97e979853a7e68d07c2c2cd67572e98537d4730ff2b"} Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.237560 4892 scope.go:117] "RemoveContainer" containerID="eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.273390 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dbf87fcbd-5txkh"] Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.282581 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dbf87fcbd-5txkh"] Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.301775 4892 scope.go:117] "RemoveContainer" containerID="835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.325786 4892 scope.go:117] "RemoveContainer" containerID="eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e" Feb 17 18:04:24 crc kubenswrapper[4892]: E0217 18:04:24.327295 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e\": container with ID starting with eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e not found: ID does not exist" containerID="eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e" 
Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.327331 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e"} err="failed to get container status \"eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e\": rpc error: code = NotFound desc = could not find container \"eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e\": container with ID starting with eedbe08f2d0e35caf96d63b86adeb2da73351d8a18beff691249c5d53839565e not found: ID does not exist" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.327358 4892 scope.go:117] "RemoveContainer" containerID="835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a" Feb 17 18:04:24 crc kubenswrapper[4892]: E0217 18:04:24.327731 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a\": container with ID starting with 835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a not found: ID does not exist" containerID="835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.327873 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a"} err="failed to get container status \"835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a\": rpc error: code = NotFound desc = could not find container \"835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a\": container with ID starting with 835e3ba73c00484c7dd2e12d5e138cb93115a4a9fe41373b6d53982dc49ef11a not found: ID does not exist" Feb 17 18:04:24 crc kubenswrapper[4892]: I0217 18:04:24.528019 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.369734 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" path="/var/lib/kubelet/pods/3ad1c61c-5ce6-4b28-9034-5620d94bebc1/volumes" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.404150 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-787594b47-2xt6h"] Feb 17 18:04:25 crc kubenswrapper[4892]: E0217 18:04:25.404856 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-api" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.404925 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-api" Feb 17 18:04:25 crc kubenswrapper[4892]: E0217 18:04:25.404988 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-httpd" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405071 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-httpd" Feb 17 18:04:25 crc kubenswrapper[4892]: E0217 18:04:25.405140 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-httpd" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405194 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-httpd" Feb 17 18:04:25 crc kubenswrapper[4892]: E0217 18:04:25.405254 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-api" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405304 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" 
containerName="neutron-api" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405735 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-httpd" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405790 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-api" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405801 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1b1f1d-e838-4b89-8d8b-e61b88a9917e" containerName="neutron-api" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.405845 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad1c61c-5ce6-4b28-9034-5620d94bebc1" containerName="neutron-httpd" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.407129 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.409807 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.410014 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.410015 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.427943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-787594b47-2xt6h"] Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488545 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-public-tls-certs\") pod \"swift-proxy-787594b47-2xt6h\" (UID: 
\"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488588 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-config-data\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488624 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-etc-swift\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488658 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-combined-ca-bundle\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488714 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jxj\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-kube-api-access-r8jxj\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488746 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-run-httpd\") pod 
\"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-log-httpd\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.488993 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-internal-tls-certs\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.590688 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-public-tls-certs\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.590746 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-config-data\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.590793 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-etc-swift\") pod \"swift-proxy-787594b47-2xt6h\" (UID: 
\"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.590878 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-combined-ca-bundle\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.590955 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jxj\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-kube-api-access-r8jxj\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.591005 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-run-httpd\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.591127 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-log-httpd\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.591194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-internal-tls-certs\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " 
pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.595365 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-run-httpd\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.595921 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-log-httpd\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.599560 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-internal-tls-certs\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.599644 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-combined-ca-bundle\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.600864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-config-data\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.610386 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-etc-swift\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.612566 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-public-tls-certs\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.618449 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jxj\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-kube-api-access-r8jxj\") pod \"swift-proxy-787594b47-2xt6h\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.726764 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.845405 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.873436 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.873628 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.880281 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.880504 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-m65tt" Feb 17 18:04:25 crc kubenswrapper[4892]: I0217 18:04:25.881166 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.007542 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.007602 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.007623 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m9g9\" (UniqueName: \"kubernetes.io/projected/0f0771f7-1250-403c-92b9-72411ed34b2a-kube-api-access-5m9g9\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.007927 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.109759 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.109991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.110024 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.110045 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m9g9\" (UniqueName: \"kubernetes.io/projected/0f0771f7-1250-403c-92b9-72411ed34b2a-kube-api-access-5m9g9\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.111542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config\") pod \"openstackclient\" (UID: 
\"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.116002 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.116579 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.128915 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m9g9\" (UniqueName: \"kubernetes.io/projected/0f0771f7-1250-403c-92b9-72411ed34b2a-kube-api-access-5m9g9\") pod \"openstackclient\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.250471 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 18:04:26 crc kubenswrapper[4892]: W0217 18:04:26.492064 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb2d2d5a_6727_4c83_800b_03a6cf43b9c1.slice/crio-fc35d0e1deb77dd99fb76aef6d27085992a2baae171876bf5b5fa1f532c2af3a WatchSource:0}: Error finding container fc35d0e1deb77dd99fb76aef6d27085992a2baae171876bf5b5fa1f532c2af3a: Status 404 returned error can't find the container with id fc35d0e1deb77dd99fb76aef6d27085992a2baae171876bf5b5fa1f532c2af3a Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.505315 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-787594b47-2xt6h"] Feb 17 18:04:26 crc kubenswrapper[4892]: I0217 18:04:26.884517 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 18:04:26 crc kubenswrapper[4892]: W0217 18:04:26.887390 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f0771f7_1250_403c_92b9_72411ed34b2a.slice/crio-0c78407d1ced216ea98e070612bbb86abc2d3928c6c4cd1cd280ed36c04efdf8 WatchSource:0}: Error finding container 0c78407d1ced216ea98e070612bbb86abc2d3928c6c4cd1cd280ed36c04efdf8: Status 404 returned error can't find the container with id 0c78407d1ced216ea98e070612bbb86abc2d3928c6c4cd1cd280ed36c04efdf8 Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.076144 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.076419 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-central-agent" containerID="cri-o://6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f" gracePeriod=30 Feb 17 18:04:27 crc kubenswrapper[4892]: 
I0217 18:04:27.076549 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-notification-agent" containerID="cri-o://9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87" gracePeriod=30 Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.076549 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="proxy-httpd" containerID="cri-o://04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb" gracePeriod=30 Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.076619 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="sg-core" containerID="cri-o://b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b" gracePeriod=30 Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.095124 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.280914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0f0771f7-1250-403c-92b9-72411ed34b2a","Type":"ContainerStarted","Data":"0c78407d1ced216ea98e070612bbb86abc2d3928c6c4cd1cd280ed36c04efdf8"} Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.283615 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-787594b47-2xt6h" event={"ID":"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1","Type":"ContainerStarted","Data":"9fe16bec1150bc58439b8a3146b91ab3797ac2826fcfb2f2ad628f2449331e3c"} Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.283988 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.284089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-787594b47-2xt6h" event={"ID":"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1","Type":"ContainerStarted","Data":"7aa2c24e87c1dd7aca3eb443e9d0a1a6e4ac79e766130bd75af04a8a8e5e4d3c"} Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.284206 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-787594b47-2xt6h" event={"ID":"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1","Type":"ContainerStarted","Data":"fc35d0e1deb77dd99fb76aef6d27085992a2baae171876bf5b5fa1f532c2af3a"} Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.284299 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.287846 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerID="b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b" exitCode=2 Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.287895 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerDied","Data":"b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b"} Feb 17 18:04:27 crc kubenswrapper[4892]: I0217 18:04:27.311533 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-787594b47-2xt6h" podStartSLOduration=2.311511974 podStartE2EDuration="2.311511974s" podCreationTimestamp="2026-02-17 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:27.304688329 +0000 UTC m=+1238.680091594" watchObservedRunningTime="2026-02-17 18:04:27.311511974 +0000 UTC 
m=+1238.686915239" Feb 17 18:04:28 crc kubenswrapper[4892]: I0217 18:04:28.300871 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerID="04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb" exitCode=0 Feb 17 18:04:28 crc kubenswrapper[4892]: I0217 18:04:28.301166 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerID="6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f" exitCode=0 Feb 17 18:04:28 crc kubenswrapper[4892]: I0217 18:04:28.301053 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerDied","Data":"04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb"} Feb 17 18:04:28 crc kubenswrapper[4892]: I0217 18:04:28.301256 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerDied","Data":"6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f"} Feb 17 18:04:29 crc kubenswrapper[4892]: I0217 18:04:29.764846 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.311708 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.365710 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerID="9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87" exitCode=0 Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.365853 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.394130 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerDied","Data":"9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87"} Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.394171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1035d59-31a0-44b9-b196-4b66c6cd3b19","Type":"ContainerDied","Data":"44e4c8dc89b86a90fc641d14b762a1a3bf10ff4f2a58c0eefbe81d9c57a66fbb"} Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.394189 4892 scope.go:117] "RemoveContainer" containerID="04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.448084 4892 scope.go:117] "RemoveContainer" containerID="b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462086 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-config-data\") pod \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462183 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-combined-ca-bundle\") pod \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462322 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-sg-core-conf-yaml\") pod 
\"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462357 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-log-httpd\") pod \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462378 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-scripts\") pod \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462409 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v26q9\" (UniqueName: \"kubernetes.io/projected/a1035d59-31a0-44b9-b196-4b66c6cd3b19-kube-api-access-v26q9\") pod \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.462445 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-run-httpd\") pod \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\" (UID: \"a1035d59-31a0-44b9-b196-4b66c6cd3b19\") " Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.465765 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.466387 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.469006 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1035d59-31a0-44b9-b196-4b66c6cd3b19-kube-api-access-v26q9" (OuterVolumeSpecName: "kube-api-access-v26q9") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "kube-api-access-v26q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.470318 4892 scope.go:117] "RemoveContainer" containerID="9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.470450 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-scripts" (OuterVolumeSpecName: "scripts") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.494046 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.547069 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.566760 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.566799 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.566825 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.566838 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.566854 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v26q9\" (UniqueName: \"kubernetes.io/projected/a1035d59-31a0-44b9-b196-4b66c6cd3b19-kube-api-access-v26q9\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.566865 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a1035d59-31a0-44b9-b196-4b66c6cd3b19-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.595659 4892 scope.go:117] "RemoveContainer" containerID="6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.607389 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-config-data" (OuterVolumeSpecName: "config-data") pod "a1035d59-31a0-44b9-b196-4b66c6cd3b19" (UID: "a1035d59-31a0-44b9-b196-4b66c6cd3b19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.617583 4892 scope.go:117] "RemoveContainer" containerID="04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.618144 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb\": container with ID starting with 04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb not found: ID does not exist" containerID="04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.618195 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb"} err="failed to get container status \"04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb\": rpc error: code = NotFound desc = could not find container \"04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb\": container with ID starting with 04d7cdd47f4612bb109aaab80eedfd6e76a543f7f42bd0727e80e043b4e470eb not found: ID does not exist" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 
18:04:31.618231 4892 scope.go:117] "RemoveContainer" containerID="b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.618694 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b\": container with ID starting with b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b not found: ID does not exist" containerID="b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.618732 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b"} err="failed to get container status \"b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b\": rpc error: code = NotFound desc = could not find container \"b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b\": container with ID starting with b2d0e8c773cacd4d4d4813ffc045dc9b16e5170cf72df0a94ecdd9c50cc60c3b not found: ID does not exist" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.618755 4892 scope.go:117] "RemoveContainer" containerID="9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.619111 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87\": container with ID starting with 9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87 not found: ID does not exist" containerID="9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.619140 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87"} err="failed to get container status \"9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87\": rpc error: code = NotFound desc = could not find container \"9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87\": container with ID starting with 9a725d366aa05121dcecb9ab3f90ced7fb12020c6c4a0261df591ac3b87f5b87 not found: ID does not exist" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.619156 4892 scope.go:117] "RemoveContainer" containerID="6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.619396 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f\": container with ID starting with 6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f not found: ID does not exist" containerID="6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.619424 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f"} err="failed to get container status \"6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f\": rpc error: code = NotFound desc = could not find container \"6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f\": container with ID starting with 6be39c02be9a79a333e499294ba7938067b35bf9e823e2db4b9d61b286fa5f3f not found: ID does not exist" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.668424 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1035d59-31a0-44b9-b196-4b66c6cd3b19-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:31 crc kubenswrapper[4892]: 
I0217 18:04:31.704987 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.721921 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.735791 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.736726 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="proxy-httpd" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.736751 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="proxy-httpd" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.736773 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="sg-core" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.736782 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="sg-core" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.736794 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-notification-agent" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.736802 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-notification-agent" Feb 17 18:04:31 crc kubenswrapper[4892]: E0217 18:04:31.736831 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-central-agent" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.736840 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" 
containerName="ceilometer-central-agent" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.737114 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="proxy-httpd" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.737136 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-notification-agent" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.737150 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="ceilometer-central-agent" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.737160 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" containerName="sg-core" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.739495 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.742394 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.742590 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.767189 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877020 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-log-httpd\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877216 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-config-data\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877241 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-run-httpd\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877261 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877427 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877452 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-scripts\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.877557 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64hxf\" (UniqueName: 
\"kubernetes.io/projected/7d7300a7-7f48-4be7-addb-1ed7995eddc2-kube-api-access-64hxf\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.978762 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.978830 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-scripts\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.978906 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64hxf\" (UniqueName: \"kubernetes.io/projected/7d7300a7-7f48-4be7-addb-1ed7995eddc2-kube-api-access-64hxf\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.978942 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-log-httpd\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.979007 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-config-data\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: 
I0217 18:04:31.979027 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-run-httpd\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.979044 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.979466 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-log-httpd\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.979594 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-run-httpd\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.984246 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:31 crc kubenswrapper[4892]: I0217 18:04:31.988557 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-config-data\") pod \"ceilometer-0\" (UID: 
\"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:32 crc kubenswrapper[4892]: I0217 18:04:32.000148 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:32 crc kubenswrapper[4892]: I0217 18:04:32.001522 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-scripts\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:32 crc kubenswrapper[4892]: I0217 18:04:32.004633 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64hxf\" (UniqueName: \"kubernetes.io/projected/7d7300a7-7f48-4be7-addb-1ed7995eddc2-kube-api-access-64hxf\") pod \"ceilometer-0\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") " pod="openstack/ceilometer-0" Feb 17 18:04:32 crc kubenswrapper[4892]: I0217 18:04:32.072297 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:04:32 crc kubenswrapper[4892]: I0217 18:04:32.614085 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:32 crc kubenswrapper[4892]: W0217 18:04:32.638909 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-8e1ea831611da878899c42b096b4c6cff07cb9e9f10297354bed7de246da89c7 WatchSource:0}: Error finding container 8e1ea831611da878899c42b096b4c6cff07cb9e9f10297354bed7de246da89c7: Status 404 returned error can't find the container with id 8e1ea831611da878899c42b096b4c6cff07cb9e9f10297354bed7de246da89c7 Feb 17 18:04:33 crc kubenswrapper[4892]: I0217 18:04:33.373357 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1035d59-31a0-44b9-b196-4b66c6cd3b19" path="/var/lib/kubelet/pods/a1035d59-31a0-44b9-b196-4b66c6cd3b19/volumes" Feb 17 18:04:33 crc kubenswrapper[4892]: I0217 18:04:33.414062 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerStarted","Data":"8e1ea831611da878899c42b096b4c6cff07cb9e9f10297354bed7de246da89c7"} Feb 17 18:04:35 crc kubenswrapper[4892]: I0217 18:04:35.735140 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:35 crc kubenswrapper[4892]: I0217 18:04:35.741533 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:04:37 crc kubenswrapper[4892]: I0217 18:04:37.256888 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:39 crc kubenswrapper[4892]: I0217 18:04:39.485994 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerStarted","Data":"72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531"} Feb 17 18:04:39 crc kubenswrapper[4892]: I0217 18:04:39.487371 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0f0771f7-1250-403c-92b9-72411ed34b2a","Type":"ContainerStarted","Data":"9e7e188405b6a8e4ef6bb57b42cbb6f284b2256f64aba5c60b2ec472f06d945a"} Feb 17 18:04:39 crc kubenswrapper[4892]: I0217 18:04:39.509603 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.677261129 podStartE2EDuration="14.509586562s" podCreationTimestamp="2026-02-17 18:04:25 +0000 UTC" firstStartedPulling="2026-02-17 18:04:26.889345696 +0000 UTC m=+1238.264748961" lastFinishedPulling="2026-02-17 18:04:38.721671129 +0000 UTC m=+1250.097074394" observedRunningTime="2026-02-17 18:04:39.502260215 +0000 UTC m=+1250.877663480" watchObservedRunningTime="2026-02-17 18:04:39.509586562 +0000 UTC m=+1250.884989827" Feb 17 18:04:40 crc kubenswrapper[4892]: I0217 18:04:40.500022 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerStarted","Data":"e19c335e3818c29f4db8cd845e147a5ac1ecf56461f2e254342f335e3b2e8bbf"} Feb 17 18:04:40 crc kubenswrapper[4892]: I0217 18:04:40.500572 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerStarted","Data":"e731a7e80fc05929980b8d7bd8d3bb13b61da08b9a366936bb45c35c7d1e8c3a"} Feb 17 18:04:42 crc kubenswrapper[4892]: I0217 18:04:42.518727 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerStarted","Data":"2724110921dba80b4d4bf3594d98687dd917ed130282ef15382bf87643d60d5f"} Feb 17 18:04:42 crc 
kubenswrapper[4892]: I0217 18:04:42.518888 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-central-agent" containerID="cri-o://72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531" gracePeriod=30 Feb 17 18:04:42 crc kubenswrapper[4892]: I0217 18:04:42.518919 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="sg-core" containerID="cri-o://e19c335e3818c29f4db8cd845e147a5ac1ecf56461f2e254342f335e3b2e8bbf" gracePeriod=30 Feb 17 18:04:42 crc kubenswrapper[4892]: I0217 18:04:42.518898 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="proxy-httpd" containerID="cri-o://2724110921dba80b4d4bf3594d98687dd917ed130282ef15382bf87643d60d5f" gracePeriod=30 Feb 17 18:04:42 crc kubenswrapper[4892]: I0217 18:04:42.518976 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-notification-agent" containerID="cri-o://e731a7e80fc05929980b8d7bd8d3bb13b61da08b9a366936bb45c35c7d1e8c3a" gracePeriod=30 Feb 17 18:04:42 crc kubenswrapper[4892]: I0217 18:04:42.519332 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 18:04:42 crc kubenswrapper[4892]: I0217 18:04:42.552527 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.301512549 podStartE2EDuration="11.552502947s" podCreationTimestamp="2026-02-17 18:04:31 +0000 UTC" firstStartedPulling="2026-02-17 18:04:32.641334138 +0000 UTC m=+1244.016737403" lastFinishedPulling="2026-02-17 18:04:41.892324536 +0000 UTC m=+1253.267727801" 
observedRunningTime="2026-02-17 18:04:42.54036972 +0000 UTC m=+1253.915772995" watchObservedRunningTime="2026-02-17 18:04:42.552502947 +0000 UTC m=+1253.927906202" Feb 17 18:04:42 crc kubenswrapper[4892]: E0217 18:04:42.618595 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-e19c335e3818c29f4db8cd845e147a5ac1ecf56461f2e254342f335e3b2e8bbf.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:04:43 crc kubenswrapper[4892]: I0217 18:04:43.549789 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerID="2724110921dba80b4d4bf3594d98687dd917ed130282ef15382bf87643d60d5f" exitCode=0 Feb 17 18:04:43 crc kubenswrapper[4892]: I0217 18:04:43.550261 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerID="e19c335e3818c29f4db8cd845e147a5ac1ecf56461f2e254342f335e3b2e8bbf" exitCode=2 Feb 17 18:04:43 crc kubenswrapper[4892]: I0217 18:04:43.550275 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerID="e731a7e80fc05929980b8d7bd8d3bb13b61da08b9a366936bb45c35c7d1e8c3a" exitCode=0 Feb 17 18:04:43 crc kubenswrapper[4892]: I0217 18:04:43.550006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerDied","Data":"2724110921dba80b4d4bf3594d98687dd917ed130282ef15382bf87643d60d5f"} Feb 17 18:04:43 crc kubenswrapper[4892]: I0217 18:04:43.550335 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerDied","Data":"e19c335e3818c29f4db8cd845e147a5ac1ecf56461f2e254342f335e3b2e8bbf"} Feb 17 18:04:43 crc kubenswrapper[4892]: I0217 
18:04:43.550350 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerDied","Data":"e731a7e80fc05929980b8d7bd8d3bb13b61da08b9a366936bb45c35c7d1e8c3a"} Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.022416 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ctbgj"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.024489 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.032211 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ctbgj"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.137876 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dr6ns"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.139080 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.150537 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dr6ns"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.184687 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r755\" (UniqueName: \"kubernetes.io/projected/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-kube-api-access-4r755\") pod \"nova-api-db-create-ctbgj\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") " pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.185000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-operator-scripts\") pod \"nova-api-db-create-ctbgj\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") " pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.231644 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5f5a-account-create-update-bhdvm"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.232899 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.237442 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.254577 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-bhdvm"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.286705 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r755\" (UniqueName: \"kubernetes.io/projected/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-kube-api-access-4r755\") pod \"nova-api-db-create-ctbgj\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") " pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.287028 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55036453-856d-458e-a6c9-30809c87ccaf-operator-scripts\") pod \"nova-cell0-db-create-dr6ns\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") " pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.287132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-operator-scripts\") pod \"nova-api-db-create-ctbgj\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") " pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.287319 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmwg\" (UniqueName: \"kubernetes.io/projected/55036453-856d-458e-a6c9-30809c87ccaf-kube-api-access-jrmwg\") pod \"nova-cell0-db-create-dr6ns\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") " 
pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.323519 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-operator-scripts\") pod \"nova-api-db-create-ctbgj\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") " pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.328855 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r755\" (UniqueName: \"kubernetes.io/projected/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-kube-api-access-4r755\") pod \"nova-api-db-create-ctbgj\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") " pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.350089 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-c98h7"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.355048 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ctbgj" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.357492 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.393766 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55036453-856d-458e-a6c9-30809c87ccaf-operator-scripts\") pod \"nova-cell0-db-create-dr6ns\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") " pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.393864 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmwg\" (UniqueName: \"kubernetes.io/projected/55036453-856d-458e-a6c9-30809c87ccaf-kube-api-access-jrmwg\") pod \"nova-cell0-db-create-dr6ns\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") " pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.393988 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqfr6\" (UniqueName: \"kubernetes.io/projected/2c42e2da-4b51-426a-b743-b8c79e358ecb-kube-api-access-dqfr6\") pod \"nova-api-5f5a-account-create-update-bhdvm\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.394021 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c42e2da-4b51-426a-b743-b8c79e358ecb-operator-scripts\") pod \"nova-api-5f5a-account-create-update-bhdvm\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.395003 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55036453-856d-458e-a6c9-30809c87ccaf-operator-scripts\") pod 
\"nova-cell0-db-create-dr6ns\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") " pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.419399 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrmwg\" (UniqueName: \"kubernetes.io/projected/55036453-856d-458e-a6c9-30809c87ccaf-kube-api-access-jrmwg\") pod \"nova-cell0-db-create-dr6ns\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") " pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.432528 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-c98h7"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.453368 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dr6ns" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.487155 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-lphkn"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.488624 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.492173 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.499693 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqfr6\" (UniqueName: \"kubernetes.io/projected/2c42e2da-4b51-426a-b743-b8c79e358ecb-kube-api-access-dqfr6\") pod \"nova-api-5f5a-account-create-update-bhdvm\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.499740 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c42e2da-4b51-426a-b743-b8c79e358ecb-operator-scripts\") pod \"nova-api-5f5a-account-create-update-bhdvm\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.499775 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtwh\" (UniqueName: \"kubernetes.io/projected/0ffcecae-c54a-4d76-9bbd-e7c406582928-kube-api-access-7mtwh\") pod \"nova-cell1-db-create-c98h7\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") " pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.499797 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcecae-c54a-4d76-9bbd-e7c406582928-operator-scripts\") pod \"nova-cell1-db-create-c98h7\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") " pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.500858 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c42e2da-4b51-426a-b743-b8c79e358ecb-operator-scripts\") pod \"nova-api-5f5a-account-create-update-bhdvm\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.507312 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-lphkn"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.531432 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqfr6\" (UniqueName: \"kubernetes.io/projected/2c42e2da-4b51-426a-b743-b8c79e358ecb-kube-api-access-dqfr6\") pod \"nova-api-5f5a-account-create-update-bhdvm\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.602933 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ed9367-dfce-487f-b826-06981dba28ef-operator-scripts\") pod \"nova-cell0-57f0-account-create-update-lphkn\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") " pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.603009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2d82\" (UniqueName: \"kubernetes.io/projected/97ed9367-dfce-487f-b826-06981dba28ef-kube-api-access-v2d82\") pod \"nova-cell0-57f0-account-create-update-lphkn\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") " pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.603131 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7mtwh\" (UniqueName: \"kubernetes.io/projected/0ffcecae-c54a-4d76-9bbd-e7c406582928-kube-api-access-7mtwh\") pod \"nova-cell1-db-create-c98h7\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") " pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.603164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcecae-c54a-4d76-9bbd-e7c406582928-operator-scripts\") pod \"nova-cell1-db-create-c98h7\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") " pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.603862 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcecae-c54a-4d76-9bbd-e7c406582928-operator-scripts\") pod \"nova-cell1-db-create-c98h7\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") " pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.625798 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.632098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtwh\" (UniqueName: \"kubernetes.io/projected/0ffcecae-c54a-4d76-9bbd-e7c406582928-kube-api-access-7mtwh\") pod \"nova-cell1-db-create-c98h7\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") " pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.645003 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7299-account-create-update-dlm9t"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.646454 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.652615 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.661488 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-dlm9t"] Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.711324 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ed9367-dfce-487f-b826-06981dba28ef-operator-scripts\") pod \"nova-cell0-57f0-account-create-update-lphkn\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") " pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.711596 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2d82\" (UniqueName: \"kubernetes.io/projected/97ed9367-dfce-487f-b826-06981dba28ef-kube-api-access-v2d82\") pod \"nova-cell0-57f0-account-create-update-lphkn\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") " pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.712797 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ed9367-dfce-487f-b826-06981dba28ef-operator-scripts\") pod \"nova-cell0-57f0-account-create-update-lphkn\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") " pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.717899 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.728075 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2d82\" (UniqueName: \"kubernetes.io/projected/97ed9367-dfce-487f-b826-06981dba28ef-kube-api-access-v2d82\") pod \"nova-cell0-57f0-account-create-update-lphkn\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") " pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.818329 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c26p\" (UniqueName: \"kubernetes.io/projected/932e3900-65d2-4587-9da3-8fb5bea0a354-kube-api-access-2c26p\") pod \"nova-cell1-7299-account-create-update-dlm9t\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") " pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.818425 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/932e3900-65d2-4587-9da3-8fb5bea0a354-operator-scripts\") pod \"nova-cell1-7299-account-create-update-dlm9t\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") " pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.920406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/932e3900-65d2-4587-9da3-8fb5bea0a354-operator-scripts\") pod \"nova-cell1-7299-account-create-update-dlm9t\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") " pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.920921 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c26p\" (UniqueName: 
\"kubernetes.io/projected/932e3900-65d2-4587-9da3-8fb5bea0a354-kube-api-access-2c26p\") pod \"nova-cell1-7299-account-create-update-dlm9t\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") " pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.922092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/932e3900-65d2-4587-9da3-8fb5bea0a354-operator-scripts\") pod \"nova-cell1-7299-account-create-update-dlm9t\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") " pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.938563 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c26p\" (UniqueName: \"kubernetes.io/projected/932e3900-65d2-4587-9da3-8fb5bea0a354-kube-api-access-2c26p\") pod \"nova-cell1-7299-account-create-update-dlm9t\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") " pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.951364 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:47 crc kubenswrapper[4892]: I0217 18:04:47.974228 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:47.999391 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ctbgj"] Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.095323 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dr6ns"] Feb 17 18:04:48 crc kubenswrapper[4892]: W0217 18:04:48.095686 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55036453_856d_458e_a6c9_30809c87ccaf.slice/crio-28ce31c5c5570dd8d518754827e094a39b9e787f81d842020c124f983c501a52 WatchSource:0}: Error finding container 28ce31c5c5570dd8d518754827e094a39b9e787f81d842020c124f983c501a52: Status 404 returned error can't find the container with id 28ce31c5c5570dd8d518754827e094a39b9e787f81d842020c124f983c501a52 Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.228785 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-bhdvm"] Feb 17 18:04:48 crc kubenswrapper[4892]: W0217 18:04:48.229204 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c42e2da_4b51_426a_b743_b8c79e358ecb.slice/crio-6238538448a9da9dce0e4a6d8ba8030427662dcf3a5b7fe404188bb0114f83b0 WatchSource:0}: Error finding container 6238538448a9da9dce0e4a6d8ba8030427662dcf3a5b7fe404188bb0114f83b0: Status 404 returned error can't find the container with id 6238538448a9da9dce0e4a6d8ba8030427662dcf3a5b7fe404188bb0114f83b0 Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.340775 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-c98h7"] Feb 17 18:04:48 crc kubenswrapper[4892]: W0217 18:04:48.341403 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ffcecae_c54a_4d76_9bbd_e7c406582928.slice/crio-5a9b7cb7542ced15265f08059330f0e72058700564dfcd27724c14eab0cb325a WatchSource:0}: Error finding container 5a9b7cb7542ced15265f08059330f0e72058700564dfcd27724c14eab0cb325a: Status 404 returned error can't find the container with id 5a9b7cb7542ced15265f08059330f0e72058700564dfcd27724c14eab0cb325a Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.563139 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-dlm9t"] Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.572865 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-lphkn"] Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.660856 4892 generic.go:334] "Generic (PLEG): container finished" podID="55036453-856d-458e-a6c9-30809c87ccaf" containerID="55487b33f7e954c0f5c383758f8a9c00a4376ebe648b34305608a954745f431e" exitCode=0 Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.660924 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dr6ns" event={"ID":"55036453-856d-458e-a6c9-30809c87ccaf","Type":"ContainerDied","Data":"55487b33f7e954c0f5c383758f8a9c00a4376ebe648b34305608a954745f431e"} Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.660951 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dr6ns" event={"ID":"55036453-856d-458e-a6c9-30809c87ccaf","Type":"ContainerStarted","Data":"28ce31c5c5570dd8d518754827e094a39b9e787f81d842020c124f983c501a52"} Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.662824 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7299-account-create-update-dlm9t" event={"ID":"932e3900-65d2-4587-9da3-8fb5bea0a354","Type":"ContainerStarted","Data":"6247c1c7b1efc8494b79ed394a39bed8b2a9e0847f1107f1bc67224786c95650"} Feb 17 18:04:48 crc 
kubenswrapper[4892]: I0217 18:04:48.674240 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c98h7" event={"ID":"0ffcecae-c54a-4d76-9bbd-e7c406582928","Type":"ContainerStarted","Data":"d6866eacf02bafe4ccea20fee02f0034864b3623f04f665f9716feff8b87a96f"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.674278 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c98h7" event={"ID":"0ffcecae-c54a-4d76-9bbd-e7c406582928","Type":"ContainerStarted","Data":"5a9b7cb7542ced15265f08059330f0e72058700564dfcd27724c14eab0cb325a"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.688063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" event={"ID":"2c42e2da-4b51-426a-b743-b8c79e358ecb","Type":"ContainerStarted","Data":"85e1b561483fee3321875f44825c87ea1fd5510243e332469de04d2d479fd0ae"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.688304 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" event={"ID":"2c42e2da-4b51-426a-b743-b8c79e358ecb","Type":"ContainerStarted","Data":"6238538448a9da9dce0e4a6d8ba8030427662dcf3a5b7fe404188bb0114f83b0"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.699113 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" containerID="591432f5c35dbe4c788cd2a9a9485c33f60acc25bb3e704d33d8efd0d5c77925" exitCode=0
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.699280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ctbgj" event={"ID":"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293","Type":"ContainerDied","Data":"591432f5c35dbe4c788cd2a9a9485c33f60acc25bb3e704d33d8efd0d5c77925"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.699325 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ctbgj" event={"ID":"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293","Type":"ContainerStarted","Data":"93ffb378f5b8709db7d67158c99d79a7a7747315900a97d60e905fcb9966b505"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.701677 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57f0-account-create-update-lphkn" event={"ID":"97ed9367-dfce-487f-b826-06981dba28ef","Type":"ContainerStarted","Data":"bee4973e7cee7f3379aa829c2149be6c8c39dec0cf80575b50184b28c9e80216"}
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.722113 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-c98h7" podStartSLOduration=1.7220938430000001 podStartE2EDuration="1.722093843s" podCreationTimestamp="2026-02-17 18:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:48.698731303 +0000 UTC m=+1260.074134578" watchObservedRunningTime="2026-02-17 18:04:48.722093843 +0000 UTC m=+1260.097497108"
Feb 17 18:04:48 crc kubenswrapper[4892]: I0217 18:04:48.727726 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" podStartSLOduration=1.727704175 podStartE2EDuration="1.727704175s" podCreationTimestamp="2026-02-17 18:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:48.713944334 +0000 UTC m=+1260.089347599" watchObservedRunningTime="2026-02-17 18:04:48.727704175 +0000 UTC m=+1260.103107430"
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.565609 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.566142 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-log" containerID="cri-o://45f3afdcde22cd5653dcf98eb41e608b08f4b0c0f952e1bd8f4733607e7c9c02" gracePeriod=30
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.568175 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-httpd" containerID="cri-o://de10374cd7c9b05b17343f79c93b879107820127acee324e6c40ecf2299e73ef" gracePeriod=30
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.716691 4892 generic.go:334] "Generic (PLEG): container finished" podID="0ffcecae-c54a-4d76-9bbd-e7c406582928" containerID="d6866eacf02bafe4ccea20fee02f0034864b3623f04f665f9716feff8b87a96f" exitCode=0
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.716731 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c98h7" event={"ID":"0ffcecae-c54a-4d76-9bbd-e7c406582928","Type":"ContainerDied","Data":"d6866eacf02bafe4ccea20fee02f0034864b3623f04f665f9716feff8b87a96f"}
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.719263 4892 generic.go:334] "Generic (PLEG): container finished" podID="2c42e2da-4b51-426a-b743-b8c79e358ecb" containerID="85e1b561483fee3321875f44825c87ea1fd5510243e332469de04d2d479fd0ae" exitCode=0
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.719314 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" event={"ID":"2c42e2da-4b51-426a-b743-b8c79e358ecb","Type":"ContainerDied","Data":"85e1b561483fee3321875f44825c87ea1fd5510243e332469de04d2d479fd0ae"}
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.722485 4892 generic.go:334] "Generic (PLEG): container finished" podID="97ed9367-dfce-487f-b826-06981dba28ef" containerID="f199e1de12a08bfc19910317b33654d3544eab27381b1a50fd11e7d82eeae7da" exitCode=0
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.722560 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57f0-account-create-update-lphkn" event={"ID":"97ed9367-dfce-487f-b826-06981dba28ef","Type":"ContainerDied","Data":"f199e1de12a08bfc19910317b33654d3544eab27381b1a50fd11e7d82eeae7da"}
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.726152 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerID="45f3afdcde22cd5653dcf98eb41e608b08f4b0c0f952e1bd8f4733607e7c9c02" exitCode=143
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.726221 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1f5f86e-d42f-4224-862d-31337ee26ae5","Type":"ContainerDied","Data":"45f3afdcde22cd5653dcf98eb41e608b08f4b0c0f952e1bd8f4733607e7c9c02"}
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.740363 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerID="72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531" exitCode=0
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.740414 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerDied","Data":"72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531"}
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.747115 4892 generic.go:334] "Generic (PLEG): container finished" podID="932e3900-65d2-4587-9da3-8fb5bea0a354" containerID="09c15a6e47f0636a2694c22ae5f7d75544f34568564c4577e9dcfe58dd2d7927" exitCode=0
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.747311 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7299-account-create-update-dlm9t" event={"ID":"932e3900-65d2-4587-9da3-8fb5bea0a354","Type":"ContainerDied","Data":"09c15a6e47f0636a2694c22ae5f7d75544f34568564c4577e9dcfe58dd2d7927"}
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.886759 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980532 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-log-httpd\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980640 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-scripts\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980695 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-run-httpd\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980853 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64hxf\" (UniqueName: \"kubernetes.io/projected/7d7300a7-7f48-4be7-addb-1ed7995eddc2-kube-api-access-64hxf\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980876 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-sg-core-conf-yaml\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980960 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-config-data\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.980983 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-combined-ca-bundle\") pod \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\" (UID: \"7d7300a7-7f48-4be7-addb-1ed7995eddc2\") "
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.981335 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.981566 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.981592 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.992179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-scripts" (OuterVolumeSpecName: "scripts") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:49 crc kubenswrapper[4892]: I0217 18:04:49.997071 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7300a7-7f48-4be7-addb-1ed7995eddc2-kube-api-access-64hxf" (OuterVolumeSpecName: "kube-api-access-64hxf") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "kube-api-access-64hxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.029806 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.085458 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7300a7-7f48-4be7-addb-1ed7995eddc2-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.085489 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.085500 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64hxf\" (UniqueName: \"kubernetes.io/projected/7d7300a7-7f48-4be7-addb-1ed7995eddc2-kube-api-access-64hxf\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.085531 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.087713 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.129693 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-config-data" (OuterVolumeSpecName: "config-data") pod "7d7300a7-7f48-4be7-addb-1ed7995eddc2" (UID: "7d7300a7-7f48-4be7-addb-1ed7995eddc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.134862 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dr6ns"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.191138 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ctbgj"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.191327 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.191350 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7300a7-7f48-4be7-addb-1ed7995eddc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.292191 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrmwg\" (UniqueName: \"kubernetes.io/projected/55036453-856d-458e-a6c9-30809c87ccaf-kube-api-access-jrmwg\") pod \"55036453-856d-458e-a6c9-30809c87ccaf\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") "
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.292392 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-operator-scripts\") pod \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") "
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.292477 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55036453-856d-458e-a6c9-30809c87ccaf-operator-scripts\") pod \"55036453-856d-458e-a6c9-30809c87ccaf\" (UID: \"55036453-856d-458e-a6c9-30809c87ccaf\") "
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.292515 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r755\" (UniqueName: \"kubernetes.io/projected/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-kube-api-access-4r755\") pod \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\" (UID: \"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293\") "
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.292947 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55036453-856d-458e-a6c9-30809c87ccaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55036453-856d-458e-a6c9-30809c87ccaf" (UID: "55036453-856d-458e-a6c9-30809c87ccaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.292941 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" (UID: "5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.296513 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-kube-api-access-4r755" (OuterVolumeSpecName: "kube-api-access-4r755") pod "5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" (UID: "5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293"). InnerVolumeSpecName "kube-api-access-4r755". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.297708 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55036453-856d-458e-a6c9-30809c87ccaf-kube-api-access-jrmwg" (OuterVolumeSpecName: "kube-api-access-jrmwg") pod "55036453-856d-458e-a6c9-30809c87ccaf" (UID: "55036453-856d-458e-a6c9-30809c87ccaf"). InnerVolumeSpecName "kube-api-access-jrmwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.395539 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.395867 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55036453-856d-458e-a6c9-30809c87ccaf-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.395880 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r755\" (UniqueName: \"kubernetes.io/projected/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293-kube-api-access-4r755\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.395895 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrmwg\" (UniqueName: \"kubernetes.io/projected/55036453-856d-458e-a6c9-30809c87ccaf-kube-api-access-jrmwg\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.711857 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.712194 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-log" containerID="cri-o://17617b8e76c7e5c7a5b693190d099203a08b989a0c62bcf2c5a605412934d8bc" gracePeriod=30
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.712292 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-httpd" containerID="cri-o://9472b66a1c3b7f7c9c67281b9625bc73d46a15d07425d6c7adac8bfb9c882a0b" gracePeriod=30
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.759745 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dr6ns"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.759745 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dr6ns" event={"ID":"55036453-856d-458e-a6c9-30809c87ccaf","Type":"ContainerDied","Data":"28ce31c5c5570dd8d518754827e094a39b9e787f81d842020c124f983c501a52"}
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.759795 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ce31c5c5570dd8d518754827e094a39b9e787f81d842020c124f983c501a52"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.763280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7300a7-7f48-4be7-addb-1ed7995eddc2","Type":"ContainerDied","Data":"8e1ea831611da878899c42b096b4c6cff07cb9e9f10297354bed7de246da89c7"}
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.763317 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.763332 4892 scope.go:117] "RemoveContainer" containerID="2724110921dba80b4d4bf3594d98687dd917ed130282ef15382bf87643d60d5f"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.765600 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ctbgj" event={"ID":"5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293","Type":"ContainerDied","Data":"93ffb378f5b8709db7d67158c99d79a7a7747315900a97d60e905fcb9966b505"}
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.765635 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ffb378f5b8709db7d67158c99d79a7a7747315900a97d60e905fcb9966b505"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.765663 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ctbgj"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.806028 4892 scope.go:117] "RemoveContainer" containerID="e19c335e3818c29f4db8cd845e147a5ac1ecf56461f2e254342f335e3b2e8bbf"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.829069 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.839977 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861006 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:04:50 crc kubenswrapper[4892]: E0217 18:04:50.861454 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-central-agent"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861470 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-central-agent"
Feb 17 18:04:50 crc kubenswrapper[4892]: E0217 18:04:50.861486 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-notification-agent"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861493 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-notification-agent"
Feb 17 18:04:50 crc kubenswrapper[4892]: E0217 18:04:50.861502 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" containerName="mariadb-database-create"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861507 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" containerName="mariadb-database-create"
Feb 17 18:04:50 crc kubenswrapper[4892]: E0217 18:04:50.861526 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="sg-core"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861532 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="sg-core"
Feb 17 18:04:50 crc kubenswrapper[4892]: E0217 18:04:50.861554 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="proxy-httpd"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861561 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="proxy-httpd"
Feb 17 18:04:50 crc kubenswrapper[4892]: E0217 18:04:50.861577 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55036453-856d-458e-a6c9-30809c87ccaf" containerName="mariadb-database-create"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861585 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="55036453-856d-458e-a6c9-30809c87ccaf" containerName="mariadb-database-create"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861782 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="55036453-856d-458e-a6c9-30809c87ccaf" containerName="mariadb-database-create"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861796 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="sg-core"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861807 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-central-agent"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861827 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="proxy-httpd"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861836 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" containerName="mariadb-database-create"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.861848 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" containerName="ceilometer-notification-agent"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.863658 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.871632 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.871785 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.889778 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.916920 4892 scope.go:117] "RemoveContainer" containerID="e731a7e80fc05929980b8d7bd8d3bb13b61da08b9a366936bb45c35c7d1e8c3a"
Feb 17 18:04:50 crc kubenswrapper[4892]: I0217 18:04:50.970294 4892 scope.go:117] "RemoveContainer" containerID="72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.012936 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.012986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mt5\" (UniqueName: \"kubernetes.io/projected/617a072c-991e-4579-9762-678cedffd475-kube-api-access-x2mt5\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.013013 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-scripts\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.013030 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-config-data\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.013102 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-log-httpd\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.013142 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-run-httpd\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.013168 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115494 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115553 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mt5\" (UniqueName: \"kubernetes.io/projected/617a072c-991e-4579-9762-678cedffd475-kube-api-access-x2mt5\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115579 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-scripts\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115597 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-config-data\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-log-httpd\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-run-httpd\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.115691 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.116832 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-log-httpd\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.117135 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-run-httpd\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.121402 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.121800 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.121998 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-scripts\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.129851 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-config-data\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.149762 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mt5\" (UniqueName: \"kubernetes.io/projected/617a072c-991e-4579-9762-678cedffd475-kube-api-access-x2mt5\") pod \"ceilometer-0\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") " pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.229057 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.314925 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-dlm9t"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.401713 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7300a7-7f48-4be7-addb-1ed7995eddc2" path="/var/lib/kubelet/pods/7d7300a7-7f48-4be7-addb-1ed7995eddc2/volumes"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.432978 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/932e3900-65d2-4587-9da3-8fb5bea0a354-operator-scripts\") pod \"932e3900-65d2-4587-9da3-8fb5bea0a354\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.433151 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c26p\" (UniqueName: \"kubernetes.io/projected/932e3900-65d2-4587-9da3-8fb5bea0a354-kube-api-access-2c26p\") pod \"932e3900-65d2-4587-9da3-8fb5bea0a354\" (UID: \"932e3900-65d2-4587-9da3-8fb5bea0a354\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.437486 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/932e3900-65d2-4587-9da3-8fb5bea0a354-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "932e3900-65d2-4587-9da3-8fb5bea0a354" (UID: "932e3900-65d2-4587-9da3-8fb5bea0a354"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.445365 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/932e3900-65d2-4587-9da3-8fb5bea0a354-kube-api-access-2c26p" (OuterVolumeSpecName: "kube-api-access-2c26p") pod "932e3900-65d2-4587-9da3-8fb5bea0a354" (UID: "932e3900-65d2-4587-9da3-8fb5bea0a354"). InnerVolumeSpecName "kube-api-access-2c26p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.536174 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/932e3900-65d2-4587-9da3-8fb5bea0a354-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.536210 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c26p\" (UniqueName: \"kubernetes.io/projected/932e3900-65d2-4587-9da3-8fb5bea0a354-kube-api-access-2c26p\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.633642 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-lphkn"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.667605 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c98h7"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.674066 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-bhdvm"
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.738346 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2d82\" (UniqueName: \"kubernetes.io/projected/97ed9367-dfce-487f-b826-06981dba28ef-kube-api-access-v2d82\") pod \"97ed9367-dfce-487f-b826-06981dba28ef\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.738424 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtwh\" (UniqueName: \"kubernetes.io/projected/0ffcecae-c54a-4d76-9bbd-e7c406582928-kube-api-access-7mtwh\") pod \"0ffcecae-c54a-4d76-9bbd-e7c406582928\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.738465 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcecae-c54a-4d76-9bbd-e7c406582928-operator-scripts\") pod \"0ffcecae-c54a-4d76-9bbd-e7c406582928\" (UID: \"0ffcecae-c54a-4d76-9bbd-e7c406582928\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.738557 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c42e2da-4b51-426a-b743-b8c79e358ecb-operator-scripts\") pod \"2c42e2da-4b51-426a-b743-b8c79e358ecb\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.738634 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ed9367-dfce-487f-b826-06981dba28ef-operator-scripts\") pod \"97ed9367-dfce-487f-b826-06981dba28ef\" (UID: \"97ed9367-dfce-487f-b826-06981dba28ef\") "
Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.738686 4892 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dqfr6\" (UniqueName: \"kubernetes.io/projected/2c42e2da-4b51-426a-b743-b8c79e358ecb-kube-api-access-dqfr6\") pod \"2c42e2da-4b51-426a-b743-b8c79e358ecb\" (UID: \"2c42e2da-4b51-426a-b743-b8c79e358ecb\") " Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.739962 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ffcecae-c54a-4d76-9bbd-e7c406582928-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ffcecae-c54a-4d76-9bbd-e7c406582928" (UID: "0ffcecae-c54a-4d76-9bbd-e7c406582928"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.739970 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c42e2da-4b51-426a-b743-b8c79e358ecb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c42e2da-4b51-426a-b743-b8c79e358ecb" (UID: "2c42e2da-4b51-426a-b743-b8c79e358ecb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.740203 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ed9367-dfce-487f-b826-06981dba28ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97ed9367-dfce-487f-b826-06981dba28ef" (UID: "97ed9367-dfce-487f-b826-06981dba28ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.744455 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c42e2da-4b51-426a-b743-b8c79e358ecb-kube-api-access-dqfr6" (OuterVolumeSpecName: "kube-api-access-dqfr6") pod "2c42e2da-4b51-426a-b743-b8c79e358ecb" (UID: "2c42e2da-4b51-426a-b743-b8c79e358ecb"). 
InnerVolumeSpecName "kube-api-access-dqfr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.744492 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffcecae-c54a-4d76-9bbd-e7c406582928-kube-api-access-7mtwh" (OuterVolumeSpecName: "kube-api-access-7mtwh") pod "0ffcecae-c54a-4d76-9bbd-e7c406582928" (UID: "0ffcecae-c54a-4d76-9bbd-e7c406582928"). InnerVolumeSpecName "kube-api-access-7mtwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.744894 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ed9367-dfce-487f-b826-06981dba28ef-kube-api-access-v2d82" (OuterVolumeSpecName: "kube-api-access-v2d82") pod "97ed9367-dfce-487f-b826-06981dba28ef" (UID: "97ed9367-dfce-487f-b826-06981dba28ef"). InnerVolumeSpecName "kube-api-access-v2d82". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.802970 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7299-account-create-update-dlm9t" event={"ID":"932e3900-65d2-4587-9da3-8fb5bea0a354","Type":"ContainerDied","Data":"6247c1c7b1efc8494b79ed394a39bed8b2a9e0847f1107f1bc67224786c95650"} Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.803018 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6247c1c7b1efc8494b79ed394a39bed8b2a9e0847f1107f1bc67224786c95650" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.803278 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-dlm9t" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.813055 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-c98h7" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.814067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c98h7" event={"ID":"0ffcecae-c54a-4d76-9bbd-e7c406582928","Type":"ContainerDied","Data":"5a9b7cb7542ced15265f08059330f0e72058700564dfcd27724c14eab0cb325a"} Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.814100 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a9b7cb7542ced15265f08059330f0e72058700564dfcd27724c14eab0cb325a" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.817751 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" event={"ID":"2c42e2da-4b51-426a-b743-b8c79e358ecb","Type":"ContainerDied","Data":"6238538448a9da9dce0e4a6d8ba8030427662dcf3a5b7fe404188bb0114f83b0"} Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.817797 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6238538448a9da9dce0e4a6d8ba8030427662dcf3a5b7fe404188bb0114f83b0" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.817890 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-bhdvm" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.840294 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c42e2da-4b51-426a-b743-b8c79e358ecb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.840318 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ed9367-dfce-487f-b826-06981dba28ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.840327 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqfr6\" (UniqueName: \"kubernetes.io/projected/2c42e2da-4b51-426a-b743-b8c79e358ecb-kube-api-access-dqfr6\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.840336 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2d82\" (UniqueName: \"kubernetes.io/projected/97ed9367-dfce-487f-b826-06981dba28ef-kube-api-access-v2d82\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.840346 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mtwh\" (UniqueName: \"kubernetes.io/projected/0ffcecae-c54a-4d76-9bbd-e7c406582928-kube-api-access-7mtwh\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.840355 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcecae-c54a-4d76-9bbd-e7c406582928-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.845147 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57f0-account-create-update-lphkn" 
event={"ID":"97ed9367-dfce-487f-b826-06981dba28ef","Type":"ContainerDied","Data":"bee4973e7cee7f3379aa829c2149be6c8c39dec0cf80575b50184b28c9e80216"} Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.845192 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bee4973e7cee7f3379aa829c2149be6c8c39dec0cf80575b50184b28c9e80216" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.845349 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-lphkn" Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.857108 4892 generic.go:334] "Generic (PLEG): container finished" podID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerID="17617b8e76c7e5c7a5b693190d099203a08b989a0c62bcf2c5a605412934d8bc" exitCode=143 Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.857157 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2083df34-114e-4d2e-a85c-7a9ca940defa","Type":"ContainerDied","Data":"17617b8e76c7e5c7a5b693190d099203a08b989a0c62bcf2c5a605412934d8bc"} Feb 17 18:04:51 crc kubenswrapper[4892]: I0217 18:04:51.890169 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.008698 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.059331 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.063709 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.138880 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-544cf5fc64-d8jkh"] Feb 17 18:04:52 crc kubenswrapper[4892]: 
I0217 18:04:52.139171 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-544cf5fc64-d8jkh" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-log" containerID="cri-o://032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155" gracePeriod=30 Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.139346 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-544cf5fc64-d8jkh" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-api" containerID="cri-o://e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be" gracePeriod=30 Feb 17 18:04:52 crc kubenswrapper[4892]: E0217 18:04:52.883885 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-conmon-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1f5f86e_d42f_4224_862d_31337ee26ae5.slice/crio-de10374cd7c9b05b17343f79c93b879107820127acee324e6c40ecf2299e73ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.919084 4892 generic.go:334] "Generic (PLEG): container finished" podID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerID="032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155" exitCode=143 Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.919259 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544cf5fc64-d8jkh" 
event={"ID":"0b04ca1c-a720-49e9-81dd-9be5c4695174","Type":"ContainerDied","Data":"032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155"} Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.923949 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerID="de10374cd7c9b05b17343f79c93b879107820127acee324e6c40ecf2299e73ef" exitCode=0 Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.923992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1f5f86e-d42f-4224-862d-31337ee26ae5","Type":"ContainerDied","Data":"de10374cd7c9b05b17343f79c93b879107820127acee324e6c40ecf2299e73ef"} Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.927187 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerStarted","Data":"140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb"} Feb 17 18:04:52 crc kubenswrapper[4892]: I0217 18:04:52.927238 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerStarted","Data":"4b88a4771c0aa012aab1b712aaf06f42d78c5e4e48f307a382849477a7af8010"} Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.196919 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276161 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-combined-ca-bundle\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276201 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-config-data\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276317 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-public-tls-certs\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276364 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-scripts\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276894 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-logs\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.276979 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-httpd-run\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.277026 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4cks\" (UniqueName: \"kubernetes.io/projected/a1f5f86e-d42f-4224-862d-31337ee26ae5-kube-api-access-l4cks\") pod \"a1f5f86e-d42f-4224-862d-31337ee26ae5\" (UID: \"a1f5f86e-d42f-4224-862d-31337ee26ae5\") " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.278221 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-logs" (OuterVolumeSpecName: "logs") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.278411 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.288908 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.294887 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f5f86e-d42f-4224-862d-31337ee26ae5-kube-api-access-l4cks" (OuterVolumeSpecName: "kube-api-access-l4cks") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "kube-api-access-l4cks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.297618 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-scripts" (OuterVolumeSpecName: "scripts") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.344209 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.379382 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.379681 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.379694 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.379707 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.379717 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f5f86e-d42f-4224-862d-31337ee26ae5-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.379728 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4cks\" (UniqueName: \"kubernetes.io/projected/a1f5f86e-d42f-4224-862d-31337ee26ae5-kube-api-access-l4cks\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.386908 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.403229 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.407942 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-config-data" (OuterVolumeSpecName: "config-data") pod "a1f5f86e-d42f-4224-862d-31337ee26ae5" (UID: "a1f5f86e-d42f-4224-862d-31337ee26ae5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.482098 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.482133 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.482145 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f5f86e-d42f-4224-862d-31337ee26ae5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.938924 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1f5f86e-d42f-4224-862d-31337ee26ae5","Type":"ContainerDied","Data":"06dd33940ae2315f8e5deff14eaac70cef1ba80a24ab41c5ddd831c749496d49"} Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.938963 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.939239 4892 scope.go:117] "RemoveContainer" containerID="de10374cd7c9b05b17343f79c93b879107820127acee324e6c40ecf2299e73ef" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.941626 4892 generic.go:334] "Generic (PLEG): container finished" podID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerID="9472b66a1c3b7f7c9c67281b9625bc73d46a15d07425d6c7adac8bfb9c882a0b" exitCode=0 Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.941682 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2083df34-114e-4d2e-a85c-7a9ca940defa","Type":"ContainerDied","Data":"9472b66a1c3b7f7c9c67281b9625bc73d46a15d07425d6c7adac8bfb9c882a0b"} Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.941774 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": dial tcp 10.217.0.177:9292: connect: connection refused" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.941866 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": dial tcp 10.217.0.177:9292: connect: connection refused" Feb 17 18:04:53 crc kubenswrapper[4892]: I0217 18:04:53.944496 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerStarted","Data":"8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8"} Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.056713 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.066867 4892 scope.go:117] "RemoveContainer" containerID="45f3afdcde22cd5653dcf98eb41e608b08f4b0c0f952e1bd8f4733607e7c9c02" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.096904 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.116674 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:04:54 crc kubenswrapper[4892]: E0217 18:04:54.117460 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ed9367-dfce-487f-b826-06981dba28ef" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.117548 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ed9367-dfce-487f-b826-06981dba28ef" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: E0217 18:04:54.117634 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-httpd" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.117706 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-httpd" Feb 17 18:04:54 crc kubenswrapper[4892]: E0217 18:04:54.117787 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-log" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.117864 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-log" Feb 17 18:04:54 crc kubenswrapper[4892]: E0217 18:04:54.117945 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffcecae-c54a-4d76-9bbd-e7c406582928" containerName="mariadb-database-create" Feb 17 18:04:54 crc 
kubenswrapper[4892]: I0217 18:04:54.118018 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffcecae-c54a-4d76-9bbd-e7c406582928" containerName="mariadb-database-create" Feb 17 18:04:54 crc kubenswrapper[4892]: E0217 18:04:54.118086 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e3900-65d2-4587-9da3-8fb5bea0a354" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.118148 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e3900-65d2-4587-9da3-8fb5bea0a354" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: E0217 18:04:54.118221 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c42e2da-4b51-426a-b743-b8c79e358ecb" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.118287 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c42e2da-4b51-426a-b743-b8c79e358ecb" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.118566 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-httpd" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.120101 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffcecae-c54a-4d76-9bbd-e7c406582928" containerName="mariadb-database-create" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.120203 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ed9367-dfce-487f-b826-06981dba28ef" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.120269 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c42e2da-4b51-426a-b743-b8c79e358ecb" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.120332 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" containerName="glance-log" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.120394 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="932e3900-65d2-4587-9da3-8fb5bea0a354" containerName="mariadb-account-create-update" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.121486 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.124455 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.130229 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.132998 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.197946 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198005 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-logs\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198085 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198104 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198146 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198164 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198204 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.198245 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fxvjh\" (UniqueName: \"kubernetes.io/projected/3ca9c4ed-1247-4340-a675-b9d50dcbed99-kube-api-access-fxvjh\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303337 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303462 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303488 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303559 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303633 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxvjh\" (UniqueName: \"kubernetes.io/projected/3ca9c4ed-1247-4340-a675-b9d50dcbed99-kube-api-access-fxvjh\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303660 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.303724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-logs\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.304254 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-logs\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.304470 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.307830 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.313687 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.316651 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.322261 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.326888 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxvjh\" (UniqueName: \"kubernetes.io/projected/3ca9c4ed-1247-4340-a675-b9d50dcbed99-kube-api-access-fxvjh\") pod \"glance-default-external-api-0\" (UID: 
\"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.328722 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.361759 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.459541 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.557086 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.593425 4892 scope.go:117] "RemoveContainer" containerID="81d3fec03d8b556c36c63d64b92c0f9f6280adb38914475942317333432f11e3" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611525 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-config-data\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611570 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611628 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-httpd-run\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611670 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-internal-tls-certs\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611712 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hqqp\" (UniqueName: \"kubernetes.io/projected/2083df34-114e-4d2e-a85c-7a9ca940defa-kube-api-access-5hqqp\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 
crc kubenswrapper[4892]: I0217 18:04:54.611786 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-logs\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611838 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-combined-ca-bundle\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.611914 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-scripts\") pod \"2083df34-114e-4d2e-a85c-7a9ca940defa\" (UID: \"2083df34-114e-4d2e-a85c-7a9ca940defa\") " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.614442 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-logs" (OuterVolumeSpecName: "logs") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.615054 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.620275 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-scripts" (OuterVolumeSpecName: "scripts") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.623196 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.625142 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2083df34-114e-4d2e-a85c-7a9ca940defa-kube-api-access-5hqqp" (OuterVolumeSpecName: "kube-api-access-5hqqp") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "kube-api-access-5hqqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.672624 4892 scope.go:117] "RemoveContainer" containerID="68d2f045e145141f5a24c2e216528c4adbbaf49071f246c94d8ef6c8ad6acd8f" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.684917 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.714039 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.714071 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.714088 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.714120 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.714163 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2083df34-114e-4d2e-a85c-7a9ca940defa-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.714180 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hqqp\" (UniqueName: \"kubernetes.io/projected/2083df34-114e-4d2e-a85c-7a9ca940defa-kube-api-access-5hqqp\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.730030 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.730224 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-config-data" (OuterVolumeSpecName: "config-data") pod "2083df34-114e-4d2e-a85c-7a9ca940defa" (UID: "2083df34-114e-4d2e-a85c-7a9ca940defa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.764074 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.820370 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.820402 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.820413 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2083df34-114e-4d2e-a85c-7a9ca940defa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.823906 4892 scope.go:117] "RemoveContainer" containerID="29fd21523b60634facb51f1651e0135c25756c7042d90f72aac5105b2827bfe9" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.929837 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:04:54 crc kubenswrapper[4892]: W0217 18:04:54.933191 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ca9c4ed_1247_4340_a675_b9d50dcbed99.slice/crio-1355cce60f595bcb75a2bac3bfdd2a93600d6895b9428d228a35fb00e8fe0f9e WatchSource:0}: Error finding container 1355cce60f595bcb75a2bac3bfdd2a93600d6895b9428d228a35fb00e8fe0f9e: Status 404 returned error can't find the container with id 1355cce60f595bcb75a2bac3bfdd2a93600d6895b9428d228a35fb00e8fe0f9e Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.965541 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2083df34-114e-4d2e-a85c-7a9ca940defa","Type":"ContainerDied","Data":"e8551b202935e45099ef6792acc2b97c171e2581e16bcf43a4b72911434e0e0f"} Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.965595 4892 scope.go:117] "RemoveContainer" containerID="9472b66a1c3b7f7c9c67281b9625bc73d46a15d07425d6c7adac8bfb9c882a0b" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.965736 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.986701 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerStarted","Data":"2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6"} Feb 17 18:04:54 crc kubenswrapper[4892]: I0217 18:04:54.991067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ca9c4ed-1247-4340-a675-b9d50dcbed99","Type":"ContainerStarted","Data":"1355cce60f595bcb75a2bac3bfdd2a93600d6895b9428d228a35fb00e8fe0f9e"} Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.016048 4892 scope.go:117] "RemoveContainer" containerID="17617b8e76c7e5c7a5b693190d099203a08b989a0c62bcf2c5a605412934d8bc" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.023907 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.056879 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.073077 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:04:55 crc kubenswrapper[4892]: E0217 18:04:55.073664 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-httpd" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.073680 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-httpd" Feb 17 18:04:55 crc kubenswrapper[4892]: E0217 18:04:55.073701 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-log" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 
18:04:55.073707 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-log" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.073940 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-httpd" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.073961 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" containerName="glance-log" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.075059 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.076930 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.078032 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.103323 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127133 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127397 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127499 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127636 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127716 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkllr\" (UniqueName: \"kubernetes.io/projected/e69accab-69f4-4f35-91cc-b9fb1d0fded2-kube-api-access-wkllr\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.127939 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.128033 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.231996 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232045 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232084 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232136 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkllr\" (UniqueName: \"kubernetes.io/projected/e69accab-69f4-4f35-91cc-b9fb1d0fded2-kube-api-access-wkllr\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232179 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232220 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.232257 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.234786 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0" Feb 17 18:04:55 crc 
kubenswrapper[4892]: I0217 18:04:55.235026 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.236074 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.239679 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.251070 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.253372 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.253424 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.258944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkllr\" (UniqueName: \"kubernetes.io/projected/e69accab-69f4-4f35-91cc-b9fb1d0fded2-kube-api-access-wkllr\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.272231 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.380035 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2083df34-114e-4d2e-a85c-7a9ca940defa" path="/var/lib/kubelet/pods/2083df34-114e-4d2e-a85c-7a9ca940defa/volumes"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.380870 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f5f86e-d42f-4224-862d-31337ee26ae5" path="/var/lib/kubelet/pods/a1f5f86e-d42f-4224-862d-31337ee26ae5/volumes"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.420932 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.926527 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-544cf5fc64-d8jkh"
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.951645 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh8rk\" (UniqueName: \"kubernetes.io/projected/0b04ca1c-a720-49e9-81dd-9be5c4695174-kube-api-access-jh8rk\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.951723 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-internal-tls-certs\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.951848 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-scripts\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.951925 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-config-data\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.952081 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-combined-ca-bundle\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.952122 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-public-tls-certs\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.952399 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b04ca1c-a720-49e9-81dd-9be5c4695174-logs\") pod \"0b04ca1c-a720-49e9-81dd-9be5c4695174\" (UID: \"0b04ca1c-a720-49e9-81dd-9be5c4695174\") "
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.956053 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b04ca1c-a720-49e9-81dd-9be5c4695174-logs" (OuterVolumeSpecName: "logs") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.960608 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-scripts" (OuterVolumeSpecName: "scripts") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:55 crc kubenswrapper[4892]: I0217 18:04:55.971150 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b04ca1c-a720-49e9-81dd-9be5c4695174-kube-api-access-jh8rk" (OuterVolumeSpecName: "kube-api-access-jh8rk") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "kube-api-access-jh8rk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.005309 4892 generic.go:334] "Generic (PLEG): container finished" podID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerID="e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be" exitCode=0
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.005374 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-544cf5fc64-d8jkh"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.005365 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544cf5fc64-d8jkh" event={"ID":"0b04ca1c-a720-49e9-81dd-9be5c4695174","Type":"ContainerDied","Data":"e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be"}
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.005778 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544cf5fc64-d8jkh" event={"ID":"0b04ca1c-a720-49e9-81dd-9be5c4695174","Type":"ContainerDied","Data":"43dde95c545d91ff0eb2aca888d539a13a9b399124d85f84738e9d14ce3800da"}
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.006084 4892 scope.go:117] "RemoveContainer" containerID="e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.017231 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ca9c4ed-1247-4340-a675-b9d50dcbed99","Type":"ContainerStarted","Data":"bbca05ba8000e1544dd1256fe1d48355fe4077385199194480c09fd31d0d03ad"}
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.054408 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b04ca1c-a720-49e9-81dd-9be5c4695174-logs\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.054443 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh8rk\" (UniqueName: \"kubernetes.io/projected/0b04ca1c-a720-49e9-81dd-9be5c4695174-kube-api-access-jh8rk\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.054458 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.062083 4892 scope.go:117] "RemoveContainer" containerID="032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.096321 4892 scope.go:117] "RemoveContainer" containerID="e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be"
Feb 17 18:04:56 crc kubenswrapper[4892]: E0217 18:04:56.096742 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be\": container with ID starting with e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be not found: ID does not exist" containerID="e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.096777 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be"} err="failed to get container status \"e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be\": rpc error: code = NotFound desc = could not find container \"e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be\": container with ID starting with e79fbb3e74498bb58ea8a3ea7e13222eb9735c19c6b7cac1eb382a5035c993be not found: ID does not exist"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.096802 4892 scope.go:117] "RemoveContainer" containerID="032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155"
Feb 17 18:04:56 crc kubenswrapper[4892]: E0217 18:04:56.097074 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155\": container with ID starting with 032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155 not found: ID does not exist" containerID="032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.097097 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155"} err="failed to get container status \"032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155\": rpc error: code = NotFound desc = could not find container \"032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155\": container with ID starting with 032eb4d4bda641c6ae9d5a7fe1126add66df20804e1aea991d6602b7e9264155 not found: ID does not exist"
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.100499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-config-data" (OuterVolumeSpecName: "config-data") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.127086 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.149033 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.157352 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.157385 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.157400 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.173307 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0b04ca1c-a720-49e9-81dd-9be5c4695174" (UID: "0b04ca1c-a720-49e9-81dd-9be5c4695174"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.258781 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b04ca1c-a720-49e9-81dd-9be5c4695174-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.348695 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-544cf5fc64-d8jkh"]
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.367362 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-544cf5fc64-d8jkh"]
Feb 17 18:04:56 crc kubenswrapper[4892]: I0217 18:04:56.402913 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 18:04:56 crc kubenswrapper[4892]: W0217 18:04:56.413887 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode69accab_69f4_4f35_91cc_b9fb1d0fded2.slice/crio-0db668624ce7395d4fcbd56b8ca1553cb4fc0314f6cf7995da270ed58dc804ac WatchSource:0}: Error finding container 0db668624ce7395d4fcbd56b8ca1553cb4fc0314f6cf7995da270ed58dc804ac: Status 404 returned error can't find the container with id 0db668624ce7395d4fcbd56b8ca1553cb4fc0314f6cf7995da270ed58dc804ac
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.029545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ca9c4ed-1247-4340-a675-b9d50dcbed99","Type":"ContainerStarted","Data":"c99d3760836f82bf94729ec9f9c217c2b3cda31831ae4231d401c839708c201c"}
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.032103 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69accab-69f4-4f35-91cc-b9fb1d0fded2","Type":"ContainerStarted","Data":"094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc"}
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.032139 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69accab-69f4-4f35-91cc-b9fb1d0fded2","Type":"ContainerStarted","Data":"0db668624ce7395d4fcbd56b8ca1553cb4fc0314f6cf7995da270ed58dc804ac"}
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.035099 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerStarted","Data":"060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935"}
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.035304 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-central-agent" containerID="cri-o://140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb" gracePeriod=30
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.035548 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.035604 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="proxy-httpd" containerID="cri-o://060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935" gracePeriod=30
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.035667 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="sg-core" containerID="cri-o://2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6" gracePeriod=30
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.035716 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-notification-agent" containerID="cri-o://8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8" gracePeriod=30
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.056577 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.056557071 podStartE2EDuration="3.056557071s" podCreationTimestamp="2026-02-17 18:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:57.048391511 +0000 UTC m=+1268.423794786" watchObservedRunningTime="2026-02-17 18:04:57.056557071 +0000 UTC m=+1268.431960336"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.076512 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.210338658 podStartE2EDuration="7.076493598s" podCreationTimestamp="2026-02-17 18:04:50 +0000 UTC" firstStartedPulling="2026-02-17 18:04:51.901089914 +0000 UTC m=+1263.276493179" lastFinishedPulling="2026-02-17 18:04:55.767244854 +0000 UTC m=+1267.142648119" observedRunningTime="2026-02-17 18:04:57.075008567 +0000 UTC m=+1268.450411842" watchObservedRunningTime="2026-02-17 18:04:57.076493598 +0000 UTC m=+1268.451896873"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.381385 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" path="/var/lib/kubelet/pods/0b04ca1c-a720-49e9-81dd-9be5c4695174/volumes"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.712045 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lvxxb"]
Feb 17 18:04:57 crc kubenswrapper[4892]: E0217 18:04:57.712715 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-api"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.712732 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-api"
Feb 17 18:04:57 crc kubenswrapper[4892]: E0217 18:04:57.712759 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-log"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.712765 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-log"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.712984 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-log"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.713012 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b04ca1c-a720-49e9-81dd-9be5c4695174" containerName="placement-api"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.713574 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.716784 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-r65jn"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.717496 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.717777 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.727677 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lvxxb"]
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.832877 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4tb\" (UniqueName: \"kubernetes.io/projected/0d7546b1-a71d-4a95-afd3-adf70b749d04-kube-api-access-nz4tb\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.832920 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.833341 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-scripts\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.833500 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-config-data\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.936081 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-scripts\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.936145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-config-data\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.936245 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4tb\" (UniqueName: \"kubernetes.io/projected/0d7546b1-a71d-4a95-afd3-adf70b749d04-kube-api-access-nz4tb\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.936271 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.940679 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-scripts\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.941424 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.953499 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-config-data\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:57 crc kubenswrapper[4892]: I0217 18:04:57.962685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4tb\" (UniqueName: \"kubernetes.io/projected/0d7546b1-a71d-4a95-afd3-adf70b749d04-kube-api-access-nz4tb\") pod \"nova-cell0-conductor-db-sync-lvxxb\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.035222 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lvxxb"
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.063938 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69accab-69f4-4f35-91cc-b9fb1d0fded2","Type":"ContainerStarted","Data":"a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e"}
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.070319 4892 generic.go:334] "Generic (PLEG): container finished" podID="617a072c-991e-4579-9762-678cedffd475" containerID="060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935" exitCode=0
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.070357 4892 generic.go:334] "Generic (PLEG): container finished" podID="617a072c-991e-4579-9762-678cedffd475" containerID="2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6" exitCode=2
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.070368 4892 generic.go:334] "Generic (PLEG): container finished" podID="617a072c-991e-4579-9762-678cedffd475" containerID="8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8" exitCode=0
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.071758 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerDied","Data":"060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935"}
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.071799 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerDied","Data":"2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6"}
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.071829 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerDied","Data":"8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8"}
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.095015 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.094995795 podStartE2EDuration="3.094995795s" podCreationTimestamp="2026-02-17 18:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:04:58.08773794 +0000 UTC m=+1269.463141205" watchObservedRunningTime="2026-02-17 18:04:58.094995795 +0000 UTC m=+1269.470399060"
Feb 17 18:04:58 crc kubenswrapper[4892]: I0217 18:04:58.546327 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lvxxb"]
Feb 17 18:04:58 crc kubenswrapper[4892]: W0217 18:04:58.552410 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d7546b1_a71d_4a95_afd3_adf70b749d04.slice/crio-4c48082f821d82c31ff1271455260c211ef89bea46a5d08f4323d100ac0432c7 WatchSource:0}: Error finding container 4c48082f821d82c31ff1271455260c211ef89bea46a5d08f4323d100ac0432c7: Status 404 returned error can't find the container with id 4c48082f821d82c31ff1271455260c211ef89bea46a5d08f4323d100ac0432c7
Feb 17 18:04:59 crc kubenswrapper[4892]: I0217 18:04:59.085879 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" event={"ID":"0d7546b1-a71d-4a95-afd3-adf70b749d04","Type":"ContainerStarted","Data":"4c48082f821d82c31ff1271455260c211ef89bea46a5d08f4323d100ac0432c7"}
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.769231 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.896861 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2mt5\" (UniqueName: \"kubernetes.io/projected/617a072c-991e-4579-9762-678cedffd475-kube-api-access-x2mt5\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.896921 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-config-data\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.897010 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-log-httpd\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.897084 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-scripts\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.897106 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-combined-ca-bundle\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.897193 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-sg-core-conf-yaml\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.897255 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-run-httpd\") pod \"617a072c-991e-4579-9762-678cedffd475\" (UID: \"617a072c-991e-4579-9762-678cedffd475\") "
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.898060 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.898533 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.906975 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-scripts" (OuterVolumeSpecName: "scripts") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.907219 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617a072c-991e-4579-9762-678cedffd475-kube-api-access-x2mt5" (OuterVolumeSpecName: "kube-api-access-x2mt5") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "kube-api-access-x2mt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.939237 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.999893 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.999927 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2mt5\" (UniqueName: \"kubernetes.io/projected/617a072c-991e-4579-9762-678cedffd475-kube-api-access-x2mt5\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.999938 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/617a072c-991e-4579-9762-678cedffd475-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:00 crc kubenswrapper[4892]: I0217 18:05:00.999948 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:00.999956 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.002055 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.007657 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-config-data" (OuterVolumeSpecName: "config-data") pod "617a072c-991e-4579-9762-678cedffd475" (UID: "617a072c-991e-4579-9762-678cedffd475"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.102110 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.102156 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617a072c-991e-4579-9762-678cedffd475-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.113218 4892 generic.go:334] "Generic (PLEG): container finished" podID="617a072c-991e-4579-9762-678cedffd475" containerID="140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb" exitCode=0
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.113274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerDied","Data":"140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb"}
Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.113307 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.113324 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"617a072c-991e-4579-9762-678cedffd475","Type":"ContainerDied","Data":"4b88a4771c0aa012aab1b712aaf06f42d78c5e4e48f307a382849477a7af8010"} Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.113348 4892 scope.go:117] "RemoveContainer" containerID="060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.180697 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.195776 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.207940 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:01 crc kubenswrapper[4892]: E0217 18:05:01.208489 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="sg-core" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208504 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="sg-core" Feb 17 18:05:01 crc kubenswrapper[4892]: E0217 18:05:01.208526 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="proxy-httpd" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208533 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="proxy-httpd" Feb 17 18:05:01 crc kubenswrapper[4892]: E0217 18:05:01.208566 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-notification-agent" Feb 17 18:05:01 crc kubenswrapper[4892]: 
I0217 18:05:01.208575 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-notification-agent" Feb 17 18:05:01 crc kubenswrapper[4892]: E0217 18:05:01.208595 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-central-agent" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208604 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-central-agent" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208864 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-notification-agent" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208892 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="sg-core" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208902 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="proxy-httpd" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.208915 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="617a072c-991e-4579-9762-678cedffd475" containerName="ceilometer-central-agent" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.211271 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.213228 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.213473 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.226268 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.307949 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g5dz\" (UniqueName: \"kubernetes.io/projected/ea2f704e-0caa-453e-9836-3b90322735c8-kube-api-access-4g5dz\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.308041 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.308078 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-scripts\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.308105 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-run-httpd\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " 
pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.308213 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-config-data\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.308314 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-log-httpd\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.308340 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.371518 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617a072c-991e-4579-9762-678cedffd475" path="/var/lib/kubelet/pods/617a072c-991e-4579-9762-678cedffd475/volumes" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.410387 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-config-data\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.410915 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-log-httpd\") pod 
\"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.410946 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.410993 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g5dz\" (UniqueName: \"kubernetes.io/projected/ea2f704e-0caa-453e-9836-3b90322735c8-kube-api-access-4g5dz\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.411066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.411092 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-scripts\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.411115 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-run-httpd\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.411586 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-log-httpd\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.412179 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-run-httpd\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.415415 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.416377 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.417133 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-config-data\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.420785 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-scripts\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 
18:05:01.429675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g5dz\" (UniqueName: \"kubernetes.io/projected/ea2f704e-0caa-453e-9836-3b90322735c8-kube-api-access-4g5dz\") pod \"ceilometer-0\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " pod="openstack/ceilometer-0" Feb 17 18:05:01 crc kubenswrapper[4892]: I0217 18:05:01.529311 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:03 crc kubenswrapper[4892]: E0217 18:05:03.150879 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-conmon-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:05:04 crc kubenswrapper[4892]: I0217 18:05:04.460780 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 18:05:04 crc kubenswrapper[4892]: I0217 18:05:04.461112 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 18:05:04 crc kubenswrapper[4892]: I0217 18:05:04.493297 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 18:05:04 crc kubenswrapper[4892]: I0217 18:05:04.499483 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.150898 4892 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.151006 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.219367 4892 scope.go:117] "RemoveContainer" containerID="2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.281024 4892 scope.go:117] "RemoveContainer" containerID="8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.421202 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.421442 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.474701 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.489243 4892 scope.go:117] "RemoveContainer" containerID="140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.521637 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.538207 4892 scope.go:117] "RemoveContainer" containerID="060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935" Feb 17 18:05:05 crc kubenswrapper[4892]: E0217 18:05:05.539295 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935\": container with ID starting with 
060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935 not found: ID does not exist" containerID="060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.539329 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935"} err="failed to get container status \"060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935\": rpc error: code = NotFound desc = could not find container \"060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935\": container with ID starting with 060a20627026c73505fe9edf580b379e1b429c64cf78667462f8832867c00935 not found: ID does not exist" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.539356 4892 scope.go:117] "RemoveContainer" containerID="2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6" Feb 17 18:05:05 crc kubenswrapper[4892]: E0217 18:05:05.540729 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6\": container with ID starting with 2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6 not found: ID does not exist" containerID="2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.540785 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6"} err="failed to get container status \"2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6\": rpc error: code = NotFound desc = could not find container \"2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6\": container with ID starting with 2b9d900f1194997cadf0dfd7aab38742b3895724a55be152c7378f5fb3fba0e6 not found: ID does not 
exist" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.540806 4892 scope.go:117] "RemoveContainer" containerID="8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8" Feb 17 18:05:05 crc kubenswrapper[4892]: E0217 18:05:05.545243 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8\": container with ID starting with 8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8 not found: ID does not exist" containerID="8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.545298 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8"} err="failed to get container status \"8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8\": rpc error: code = NotFound desc = could not find container \"8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8\": container with ID starting with 8468131cf66af829d7abe4d0ffba5afd47d3ab1ccf94ebec6d39feaf37191ca8 not found: ID does not exist" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.545324 4892 scope.go:117] "RemoveContainer" containerID="140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb" Feb 17 18:05:05 crc kubenswrapper[4892]: E0217 18:05:05.545953 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb\": container with ID starting with 140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb not found: ID does not exist" containerID="140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.545985 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb"} err="failed to get container status \"140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb\": rpc error: code = NotFound desc = could not find container \"140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb\": container with ID starting with 140c90fae676f7d8d616739f920f39a82131b0db309372f9d347332abcf3c8fb not found: ID does not exist" Feb 17 18:05:05 crc kubenswrapper[4892]: I0217 18:05:05.725920 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:06 crc kubenswrapper[4892]: I0217 18:05:06.165716 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" event={"ID":"0d7546b1-a71d-4a95-afd3-adf70b749d04","Type":"ContainerStarted","Data":"a11e7aae423c70ef1e6de32b4ff56f1385410dd2b7fc9f3e8ba4ae35f96458cc"} Feb 17 18:05:06 crc kubenswrapper[4892]: I0217 18:05:06.178923 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerStarted","Data":"98974b37422b867d066b1eb269d60fe48e27ff9436302130d2a65b1286cf4dc9"} Feb 17 18:05:06 crc kubenswrapper[4892]: I0217 18:05:06.182371 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:06 crc kubenswrapper[4892]: I0217 18:05:06.182437 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:06 crc kubenswrapper[4892]: I0217 18:05:06.197634 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" podStartSLOduration=2.451641456 podStartE2EDuration="9.197607005s" podCreationTimestamp="2026-02-17 18:04:57 +0000 UTC" firstStartedPulling="2026-02-17 18:04:58.554507259 +0000 UTC 
m=+1269.929910524" lastFinishedPulling="2026-02-17 18:05:05.300472808 +0000 UTC m=+1276.675876073" observedRunningTime="2026-02-17 18:05:06.186477295 +0000 UTC m=+1277.561880570" watchObservedRunningTime="2026-02-17 18:05:06.197607005 +0000 UTC m=+1277.573010300" Feb 17 18:05:07 crc kubenswrapper[4892]: I0217 18:05:07.042720 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 18:05:07 crc kubenswrapper[4892]: I0217 18:05:07.047541 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 18:05:07 crc kubenswrapper[4892]: I0217 18:05:07.194541 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerStarted","Data":"61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a"} Feb 17 18:05:07 crc kubenswrapper[4892]: I0217 18:05:07.194581 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerStarted","Data":"3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5"} Feb 17 18:05:08 crc kubenswrapper[4892]: I0217 18:05:08.112173 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:08 crc kubenswrapper[4892]: I0217 18:05:08.114605 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 18:05:08 crc kubenswrapper[4892]: I0217 18:05:08.209467 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerStarted","Data":"9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95"} Feb 17 18:05:10 crc kubenswrapper[4892]: I0217 18:05:10.232529 4892 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerStarted","Data":"195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a"} Feb 17 18:05:10 crc kubenswrapper[4892]: I0217 18:05:10.233057 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 18:05:10 crc kubenswrapper[4892]: I0217 18:05:10.274368 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.633651275 podStartE2EDuration="9.27434961s" podCreationTimestamp="2026-02-17 18:05:01 +0000 UTC" firstStartedPulling="2026-02-17 18:05:05.714460764 +0000 UTC m=+1277.089864029" lastFinishedPulling="2026-02-17 18:05:09.355159099 +0000 UTC m=+1280.730562364" observedRunningTime="2026-02-17 18:05:10.259808248 +0000 UTC m=+1281.635211533" watchObservedRunningTime="2026-02-17 18:05:10.27434961 +0000 UTC m=+1281.649752875" Feb 17 18:05:13 crc kubenswrapper[4892]: E0217 18:05:13.440443 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-conmon-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:05:23 crc kubenswrapper[4892]: E0217 18:05:23.714743 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-conmon-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:05:31 crc kubenswrapper[4892]: I0217 18:05:31.509300 4892 generic.go:334] "Generic (PLEG): container finished" podID="0d7546b1-a71d-4a95-afd3-adf70b749d04" containerID="a11e7aae423c70ef1e6de32b4ff56f1385410dd2b7fc9f3e8ba4ae35f96458cc" exitCode=0 Feb 17 18:05:31 crc kubenswrapper[4892]: I0217 18:05:31.509397 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" event={"ID":"0d7546b1-a71d-4a95-afd3-adf70b749d04","Type":"ContainerDied","Data":"a11e7aae423c70ef1e6de32b4ff56f1385410dd2b7fc9f3e8ba4ae35f96458cc"} Feb 17 18:05:31 crc kubenswrapper[4892]: I0217 18:05:31.536853 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 18:05:32 crc kubenswrapper[4892]: I0217 18:05:32.943747 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.029538 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-scripts\") pod \"0d7546b1-a71d-4a95-afd3-adf70b749d04\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.029742 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-config-data\") pod \"0d7546b1-a71d-4a95-afd3-adf70b749d04\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.029923 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-combined-ca-bundle\") pod \"0d7546b1-a71d-4a95-afd3-adf70b749d04\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.029947 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4tb\" (UniqueName: \"kubernetes.io/projected/0d7546b1-a71d-4a95-afd3-adf70b749d04-kube-api-access-nz4tb\") pod \"0d7546b1-a71d-4a95-afd3-adf70b749d04\" (UID: \"0d7546b1-a71d-4a95-afd3-adf70b749d04\") " Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.035260 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7546b1-a71d-4a95-afd3-adf70b749d04-kube-api-access-nz4tb" (OuterVolumeSpecName: "kube-api-access-nz4tb") pod "0d7546b1-a71d-4a95-afd3-adf70b749d04" (UID: "0d7546b1-a71d-4a95-afd3-adf70b749d04"). InnerVolumeSpecName "kube-api-access-nz4tb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.035910 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-scripts" (OuterVolumeSpecName: "scripts") pod "0d7546b1-a71d-4a95-afd3-adf70b749d04" (UID: "0d7546b1-a71d-4a95-afd3-adf70b749d04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.061512 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-config-data" (OuterVolumeSpecName: "config-data") pod "0d7546b1-a71d-4a95-afd3-adf70b749d04" (UID: "0d7546b1-a71d-4a95-afd3-adf70b749d04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.065256 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d7546b1-a71d-4a95-afd3-adf70b749d04" (UID: "0d7546b1-a71d-4a95-afd3-adf70b749d04"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.132558 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.132599 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz4tb\" (UniqueName: \"kubernetes.io/projected/0d7546b1-a71d-4a95-afd3-adf70b749d04-kube-api-access-nz4tb\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.132613 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.132624 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7546b1-a71d-4a95-afd3-adf70b749d04-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.533811 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" event={"ID":"0d7546b1-a71d-4a95-afd3-adf70b749d04","Type":"ContainerDied","Data":"4c48082f821d82c31ff1271455260c211ef89bea46a5d08f4323d100ac0432c7"} Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.533869 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c48082f821d82c31ff1271455260c211ef89bea46a5d08f4323d100ac0432c7" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.533904 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lvxxb" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.651143 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 18:05:33 crc kubenswrapper[4892]: E0217 18:05:33.651708 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7546b1-a71d-4a95-afd3-adf70b749d04" containerName="nova-cell0-conductor-db-sync" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.651731 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7546b1-a71d-4a95-afd3-adf70b749d04" containerName="nova-cell0-conductor-db-sync" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.652030 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7546b1-a71d-4a95-afd3-adf70b749d04" containerName="nova-cell0-conductor-db-sync" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.652757 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.654968 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-r65jn" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.655190 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.660278 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.743001 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 
18:05:33.743053 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbtx\" (UniqueName: \"kubernetes.io/projected/03480c10-3249-4caa-b0da-919bbe13c03f-kube-api-access-tsbtx\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.743490 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.853094 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.853180 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.853226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbtx\" (UniqueName: \"kubernetes.io/projected/03480c10-3249-4caa-b0da-919bbe13c03f-kube-api-access-tsbtx\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.857864 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.858400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.876702 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbtx\" (UniqueName: \"kubernetes.io/projected/03480c10-3249-4caa-b0da-919bbe13c03f-kube-api-access-tsbtx\") pod \"nova-cell0-conductor-0\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:33 crc kubenswrapper[4892]: E0217 18:05:33.959887 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-conmon-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:05:33 crc kubenswrapper[4892]: I0217 18:05:33.970997 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:34 crc kubenswrapper[4892]: I0217 18:05:34.474350 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 18:05:34 crc kubenswrapper[4892]: I0217 18:05:34.547559 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"03480c10-3249-4caa-b0da-919bbe13c03f","Type":"ContainerStarted","Data":"6c6be2f3cb71402eb555b468f45e50e1d07f9c0a3161f46426afeb11d5d41587"} Feb 17 18:05:35 crc kubenswrapper[4892]: I0217 18:05:35.482733 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:05:35 crc kubenswrapper[4892]: I0217 18:05:35.483164 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="56066e3d-ab6b-4a76-bf31-11f8442d9285" containerName="kube-state-metrics" containerID="cri-o://d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5" gracePeriod=30 Feb 17 18:05:35 crc kubenswrapper[4892]: I0217 18:05:35.558998 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"03480c10-3249-4caa-b0da-919bbe13c03f","Type":"ContainerStarted","Data":"ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05"} Feb 17 18:05:35 crc kubenswrapper[4892]: I0217 18:05:35.559470 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:35 crc kubenswrapper[4892]: I0217 18:05:35.581643 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.581625913 podStartE2EDuration="2.581625913s" podCreationTimestamp="2026-02-17 18:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:05:35.57632576 +0000 UTC 
m=+1306.951729035" watchObservedRunningTime="2026-02-17 18:05:35.581625913 +0000 UTC m=+1306.957029178" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.020518 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.121161 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsskp\" (UniqueName: \"kubernetes.io/projected/56066e3d-ab6b-4a76-bf31-11f8442d9285-kube-api-access-jsskp\") pod \"56066e3d-ab6b-4a76-bf31-11f8442d9285\" (UID: \"56066e3d-ab6b-4a76-bf31-11f8442d9285\") " Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.126966 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56066e3d-ab6b-4a76-bf31-11f8442d9285-kube-api-access-jsskp" (OuterVolumeSpecName: "kube-api-access-jsskp") pod "56066e3d-ab6b-4a76-bf31-11f8442d9285" (UID: "56066e3d-ab6b-4a76-bf31-11f8442d9285"). InnerVolumeSpecName "kube-api-access-jsskp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.226232 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsskp\" (UniqueName: \"kubernetes.io/projected/56066e3d-ab6b-4a76-bf31-11f8442d9285-kube-api-access-jsskp\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.585854 4892 generic.go:334] "Generic (PLEG): container finished" podID="56066e3d-ab6b-4a76-bf31-11f8442d9285" containerID="d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5" exitCode=2 Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.585908 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.585948 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56066e3d-ab6b-4a76-bf31-11f8442d9285","Type":"ContainerDied","Data":"d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5"} Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.587300 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56066e3d-ab6b-4a76-bf31-11f8442d9285","Type":"ContainerDied","Data":"b95e17644cc3c59ab1e620d0e3fd9b8bf013435ef5feae7623363500958d3efb"} Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.587330 4892 scope.go:117] "RemoveContainer" containerID="d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.623325 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.632790 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.643271 4892 scope.go:117] "RemoveContainer" containerID="d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5" Feb 17 18:05:36 crc kubenswrapper[4892]: E0217 18:05:36.643757 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5\": container with ID starting with d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5 not found: ID does not exist" containerID="d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.643803 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5"} err="failed to get container status \"d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5\": rpc error: code = NotFound desc = could not find container \"d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5\": container with ID starting with d48371c955863cb82644a9a6770a2b1b75808550ad06a8eb03550a2db5621bb5 not found: ID does not exist" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.646704 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:05:36 crc kubenswrapper[4892]: E0217 18:05:36.647346 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56066e3d-ab6b-4a76-bf31-11f8442d9285" containerName="kube-state-metrics" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.647371 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="56066e3d-ab6b-4a76-bf31-11f8442d9285" containerName="kube-state-metrics" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.647607 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="56066e3d-ab6b-4a76-bf31-11f8442d9285" containerName="kube-state-metrics" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.648247 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.651676 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.651832 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.656668 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.736616 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.736671 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdg2p\" (UniqueName: \"kubernetes.io/projected/8affd9cf-1116-4643-8045-d445edeaa995-kube-api-access-qdg2p\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.736893 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.737057 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.839560 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.839757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.839985 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdg2p\" (UniqueName: \"kubernetes.io/projected/8affd9cf-1116-4643-8045-d445edeaa995-kube-api-access-qdg2p\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.840292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.846756 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.852459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.854236 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.861976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdg2p\" (UniqueName: \"kubernetes.io/projected/8affd9cf-1116-4643-8045-d445edeaa995-kube-api-access-qdg2p\") pod \"kube-state-metrics-0\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " pod="openstack/kube-state-metrics-0" Feb 17 18:05:36 crc kubenswrapper[4892]: I0217 18:05:36.972670 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.374013 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56066e3d-ab6b-4a76-bf31-11f8442d9285" path="/var/lib/kubelet/pods/56066e3d-ab6b-4a76-bf31-11f8442d9285/volumes" Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.470119 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:05:37 crc kubenswrapper[4892]: W0217 18:05:37.476454 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8affd9cf_1116_4643_8045_d445edeaa995.slice/crio-961eb86eb44cbf469fe242743791bfef8eb78a2abe611cf76220628d1eb8f528 WatchSource:0}: Error finding container 961eb86eb44cbf469fe242743791bfef8eb78a2abe611cf76220628d1eb8f528: Status 404 returned error can't find the container with id 961eb86eb44cbf469fe242743791bfef8eb78a2abe611cf76220628d1eb8f528 Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.478498 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.577713 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.577990 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-central-agent" containerID="cri-o://3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5" gracePeriod=30 Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.578082 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="sg-core" 
containerID="cri-o://9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95" gracePeriod=30 Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.578397 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-notification-agent" containerID="cri-o://61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a" gracePeriod=30 Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.578530 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="proxy-httpd" containerID="cri-o://195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a" gracePeriod=30 Feb 17 18:05:37 crc kubenswrapper[4892]: I0217 18:05:37.600286 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8affd9cf-1116-4643-8045-d445edeaa995","Type":"ContainerStarted","Data":"961eb86eb44cbf469fe242743791bfef8eb78a2abe611cf76220628d1eb8f528"} Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.642305 4892 generic.go:334] "Generic (PLEG): container finished" podID="ea2f704e-0caa-453e-9836-3b90322735c8" containerID="195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a" exitCode=0 Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.642889 4892 generic.go:334] "Generic (PLEG): container finished" podID="ea2f704e-0caa-453e-9836-3b90322735c8" containerID="9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95" exitCode=2 Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.642902 4892 generic.go:334] "Generic (PLEG): container finished" podID="ea2f704e-0caa-453e-9836-3b90322735c8" containerID="3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5" exitCode=0 Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.642477 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerDied","Data":"195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a"} Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.642973 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerDied","Data":"9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95"} Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.642988 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerDied","Data":"3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5"} Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.646519 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8affd9cf-1116-4643-8045-d445edeaa995","Type":"ContainerStarted","Data":"0f890fee5c00a218e42e924b96222201e9ce426ef9dad178a8610ce733d3612e"} Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.646672 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 18:05:38 crc kubenswrapper[4892]: I0217 18:05:38.672846 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.284716869 podStartE2EDuration="2.672789617s" podCreationTimestamp="2026-02-17 18:05:36 +0000 UTC" firstStartedPulling="2026-02-17 18:05:37.478297567 +0000 UTC m=+1308.853700822" lastFinishedPulling="2026-02-17 18:05:37.866370305 +0000 UTC m=+1309.241773570" observedRunningTime="2026-02-17 18:05:38.662510681 +0000 UTC m=+1310.037913966" watchObservedRunningTime="2026-02-17 18:05:38.672789617 +0000 UTC m=+1310.048192892" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.417855 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.550645 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-scripts\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.550751 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-sg-core-conf-yaml\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.551558 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-combined-ca-bundle\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.551609 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-run-httpd\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.551661 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-log-httpd\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.551712 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-config-data\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.551782 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g5dz\" (UniqueName: \"kubernetes.io/projected/ea2f704e-0caa-453e-9836-3b90322735c8-kube-api-access-4g5dz\") pod \"ea2f704e-0caa-453e-9836-3b90322735c8\" (UID: \"ea2f704e-0caa-453e-9836-3b90322735c8\") " Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.551869 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.552196 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.552452 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.552470 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2f704e-0caa-453e-9836-3b90322735c8-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.557277 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-scripts" (OuterVolumeSpecName: "scripts") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.557974 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea2f704e-0caa-453e-9836-3b90322735c8-kube-api-access-4g5dz" (OuterVolumeSpecName: "kube-api-access-4g5dz") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "kube-api-access-4g5dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.629741 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.654728 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g5dz\" (UniqueName: \"kubernetes.io/projected/ea2f704e-0caa-453e-9836-3b90322735c8-kube-api-access-4g5dz\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.654757 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.654770 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.690782 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-config-data" (OuterVolumeSpecName: "config-data") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.701255 4892 generic.go:334] "Generic (PLEG): container finished" podID="ea2f704e-0caa-453e-9836-3b90322735c8" containerID="61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a" exitCode=0 Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.701300 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerDied","Data":"61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a"} Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.701332 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2f704e-0caa-453e-9836-3b90322735c8","Type":"ContainerDied","Data":"98974b37422b867d066b1eb269d60fe48e27ff9436302130d2a65b1286cf4dc9"} Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.701350 4892 scope.go:117] "RemoveContainer" containerID="195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.701514 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.704957 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea2f704e-0caa-453e-9836-3b90322735c8" (UID: "ea2f704e-0caa-453e-9836-3b90322735c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.726793 4892 scope.go:117] "RemoveContainer" containerID="9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.747260 4892 scope.go:117] "RemoveContainer" containerID="61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.757000 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.757032 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f704e-0caa-453e-9836-3b90322735c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.769036 4892 scope.go:117] "RemoveContainer" containerID="3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.792824 4892 scope.go:117] "RemoveContainer" containerID="195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:41.794197 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a\": container with ID starting with 195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a not found: ID does not exist" containerID="195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.794239 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a"} 
err="failed to get container status \"195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a\": rpc error: code = NotFound desc = could not find container \"195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a\": container with ID starting with 195baf703b17acf770acb716b111eabd10cf3faea3603fb8520123df461c531a not found: ID does not exist" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.794270 4892 scope.go:117] "RemoveContainer" containerID="9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:41.795299 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95\": container with ID starting with 9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95 not found: ID does not exist" containerID="9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.795326 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95"} err="failed to get container status \"9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95\": rpc error: code = NotFound desc = could not find container \"9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95\": container with ID starting with 9906efce88544e513c1ba5c0b347c52f3cf0e87a1d2222eb597969838066ed95 not found: ID does not exist" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.795346 4892 scope.go:117] "RemoveContainer" containerID="61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:41.796397 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a\": container with ID starting with 61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a not found: ID does not exist" containerID="61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.796419 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a"} err="failed to get container status \"61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a\": rpc error: code = NotFound desc = could not find container \"61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a\": container with ID starting with 61506fa2060b7708a9e3828dc6dcd43a73a6eab9259a1fa2a6d08db85ae74f4a not found: ID does not exist" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.796436 4892 scope.go:117] "RemoveContainer" containerID="3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:41.796907 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5\": container with ID starting with 3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5 not found: ID does not exist" containerID="3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:41.796925 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5"} err="failed to get container status \"3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5\": rpc error: code = NotFound desc = could not find container \"3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5\": container with ID 
starting with 3966f9796a166765dfc49084714ee53fd2e8be23be253c4002e535fbd48882f5 not found: ID does not exist" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.097756 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.111312 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.144195 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:42.144777 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-notification-agent" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.144794 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-notification-agent" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:42.144808 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-central-agent" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.144854 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-central-agent" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:42.144882 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="sg-core" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.144891 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="sg-core" Feb 17 18:05:42 crc kubenswrapper[4892]: E0217 18:05:42.144950 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="proxy-httpd" Feb 17 18:05:42 crc 
kubenswrapper[4892]: I0217 18:05:42.144962 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="proxy-httpd" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.145268 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="proxy-httpd" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.145339 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-notification-agent" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.145371 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="ceilometer-central-agent" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.145426 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" containerName="sg-core" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.148251 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.150541 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.150583 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.153085 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.159032 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271414 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfj5\" (UniqueName: \"kubernetes.io/projected/38e4898f-d94e-49aa-add7-918fe6417c43-kube-api-access-lbfj5\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271451 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-log-httpd\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271578 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271656 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-config-data\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271702 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271803 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-run-httpd\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.271908 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-scripts\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.272134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.373675 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-run-httpd\") pod \"ceilometer-0\" (UID: 
\"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.373724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-scripts\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.373850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.373922 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfj5\" (UniqueName: \"kubernetes.io/projected/38e4898f-d94e-49aa-add7-918fe6417c43-kube-api-access-lbfj5\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.373947 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-log-httpd\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.374004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.374046 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-config-data\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.374078 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.374364 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-run-httpd\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.374655 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-log-httpd\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.378489 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.379520 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-config-data\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 
18:05:42.380235 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.380847 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.391027 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-scripts\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.392405 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfj5\" (UniqueName: \"kubernetes.io/projected/38e4898f-d94e-49aa-add7-918fe6417c43-kube-api-access-lbfj5\") pod \"ceilometer-0\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.471337 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:05:42 crc kubenswrapper[4892]: I0217 18:05:42.952137 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:05:42 crc kubenswrapper[4892]: W0217 18:05:42.954241 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38e4898f_d94e_49aa_add7_918fe6417c43.slice/crio-d5703726698fa29ac42a0e9a3eab01c0836cf4728fd1c75a8bdc113f08a05730 WatchSource:0}: Error finding container d5703726698fa29ac42a0e9a3eab01c0836cf4728fd1c75a8bdc113f08a05730: Status 404 returned error can't find the container with id d5703726698fa29ac42a0e9a3eab01c0836cf4728fd1c75a8bdc113f08a05730 Feb 17 18:05:43 crc kubenswrapper[4892]: I0217 18:05:43.370137 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea2f704e-0caa-453e-9836-3b90322735c8" path="/var/lib/kubelet/pods/ea2f704e-0caa-453e-9836-3b90322735c8/volumes" Feb 17 18:05:43 crc kubenswrapper[4892]: I0217 18:05:43.730846 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerStarted","Data":"d5703726698fa29ac42a0e9a3eab01c0836cf4728fd1c75a8bdc113f08a05730"} Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.004993 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 18:05:44 crc kubenswrapper[4892]: E0217 18:05:44.223887 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7300a7_7f48_4be7_addb_1ed7995eddc2.slice/crio-conmon-72946e09c4d279d54fcd131f9664725e2feebd52a6082782b4b42cc4b3e6b531.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.509350 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lzqfl"] Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.511104 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.512561 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.513808 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.525174 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzqfl"] Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.622731 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-config-data\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.623086 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.623306 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptrp\" (UniqueName: \"kubernetes.io/projected/60efd3a5-df5d-4f7c-949b-f9952e234b8d-kube-api-access-2ptrp\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.623470 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-scripts\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.713609 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.716257 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.720456 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.725693 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-config-data\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.725751 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.725868 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptrp\" (UniqueName: \"kubernetes.io/projected/60efd3a5-df5d-4f7c-949b-f9952e234b8d-kube-api-access-2ptrp\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.725960 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-scripts\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.731880 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.733601 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.741576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-scripts\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.746664 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-config-data\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.747296 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.750081 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.759140 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerStarted","Data":"b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f"} Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.761583 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.782668 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 
18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.784463 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptrp\" (UniqueName: \"kubernetes.io/projected/60efd3a5-df5d-4f7c-949b-f9952e234b8d-kube-api-access-2ptrp\") pod \"nova-cell0-cell-mapping-lzqfl\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " pod="openstack/nova-cell0-cell-mapping-lzqfl"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828639 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5r6\" (UniqueName: \"kubernetes.io/projected/b5fae9e7-9888-49a4-853c-7c8af9c39f98-kube-api-access-bt5r6\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828727 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828745 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8922c341-2921-434e-94d3-007a3c54b5f0-logs\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-config-data\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-config-data\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.828922 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxcv\" (UniqueName: \"kubernetes.io/projected/8922c341-2921-434e-94d3-007a3c54b5f0-kube-api-access-rsxcv\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.861538 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.862902 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.874709 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.876211 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzqfl"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.883319 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.896000 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.900540 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.907974 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.930807 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17df5ca6-aa43-460a-91a1-76a87616376e-logs\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.930970 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-config-data\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxcv\" (UniqueName: \"kubernetes.io/projected/8922c341-2921-434e-94d3-007a3c54b5f0-kube-api-access-rsxcv\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931079 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhwb\" (UniqueName: \"kubernetes.io/projected/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-kube-api-access-fjhwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931142 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-config-data\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931172 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931192 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931238 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5r6\" (UniqueName: \"kubernetes.io/projected/b5fae9e7-9888-49a4-853c-7c8af9c39f98-kube-api-access-bt5r6\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931270 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8922c341-2921-434e-94d3-007a3c54b5f0-logs\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931397 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pjbm\" (UniqueName: \"kubernetes.io/projected/17df5ca6-aa43-460a-91a1-76a87616376e-kube-api-access-5pjbm\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931452 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-config-data\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.931505 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.934558 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.934967 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8922c341-2921-434e-94d3-007a3c54b5f0-logs\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.939916 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-config-data\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.949288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.964512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxcv\" (UniqueName: \"kubernetes.io/projected/8922c341-2921-434e-94d3-007a3c54b5f0-kube-api-access-rsxcv\") pod \"nova-api-0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " pod="openstack/nova-api-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.964538 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.964598 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-config-data\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:44 crc kubenswrapper[4892]: I0217 18:05:44.974380 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5r6\" (UniqueName: \"kubernetes.io/projected/b5fae9e7-9888-49a4-853c-7c8af9c39f98-kube-api-access-bt5r6\") pod \"nova-scheduler-0\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " pod="openstack/nova-scheduler-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.005743 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-lzftn"]
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.007530 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033537 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pjbm\" (UniqueName: \"kubernetes.io/projected/17df5ca6-aa43-460a-91a1-76a87616376e-kube-api-access-5pjbm\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033600 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033652 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17df5ca6-aa43-460a-91a1-76a87616376e-logs\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033711 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhwb\" (UniqueName: \"kubernetes.io/projected/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-kube-api-access-fjhwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033750 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-config-data\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.033770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.035648 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-lzftn"]
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.040976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.041707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17df5ca6-aa43-460a-91a1-76a87616376e-logs\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.045042 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.045102 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-config-data\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.049400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.055961 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pjbm\" (UniqueName: \"kubernetes.io/projected/17df5ca6-aa43-460a-91a1-76a87616376e-kube-api-access-5pjbm\") pod \"nova-metadata-0\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.087244 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhwb\" (UniqueName: \"kubernetes.io/projected/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-kube-api-access-fjhwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.135059 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-config\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.135122 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.135145 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.135174 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd5p\" (UniqueName: \"kubernetes.io/projected/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-kube-api-access-8rd5p\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.135212 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-svc\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.135232 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.176761 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.230645 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.236873 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-config\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.236936 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.236958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.236993 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rd5p\" (UniqueName: \"kubernetes.io/projected/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-kube-api-access-8rd5p\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.237028 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-svc\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.237053 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.244658 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-config\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.244866 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.244877 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.244952 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-svc\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.245561 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.261467 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rd5p\" (UniqueName: \"kubernetes.io/projected/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-kube-api-access-8rd5p\") pod \"dnsmasq-dns-865f5d856f-lzftn\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.343802 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.356882 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.362466 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.535098 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzqfl"]
Feb 17 18:05:45 crc kubenswrapper[4892]: W0217 18:05:45.540278 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60efd3a5_df5d_4f7c_949b_f9952e234b8d.slice/crio-ef5e47cd24c01c3d4ae62af5e10b8cfacdd3cc1624d46fcd4a5c9162388dc4af WatchSource:0}: Error finding container ef5e47cd24c01c3d4ae62af5e10b8cfacdd3cc1624d46fcd4a5c9162388dc4af: Status 404 returned error can't find the container with id ef5e47cd24c01c3d4ae62af5e10b8cfacdd3cc1624d46fcd4a5c9162388dc4af
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.786137 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzqfl" event={"ID":"60efd3a5-df5d-4f7c-949b-f9952e234b8d","Type":"ContainerStarted","Data":"ef5e47cd24c01c3d4ae62af5e10b8cfacdd3cc1624d46fcd4a5c9162388dc4af"}
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.836268 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qtjqg"]
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.861208 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.867445 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.868958 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.900801 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.915278 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qtjqg"]
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.969724 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-scripts\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.969864 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.969904 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-config-data\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:45 crc kubenswrapper[4892]: I0217 18:05:45.969944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxf7\" (UniqueName: \"kubernetes.io/projected/150beb6f-d09f-42cd-8294-acfe9bf7bcee-kube-api-access-mrxf7\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.004621 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.072016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-scripts\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.072125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.072154 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-config-data\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.072190 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxf7\" (UniqueName: \"kubernetes.io/projected/150beb6f-d09f-42cd-8294-acfe9bf7bcee-kube-api-access-mrxf7\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.078133 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-scripts\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.081366 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-config-data\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.083338 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.090977 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxf7\" (UniqueName: \"kubernetes.io/projected/150beb6f-d09f-42cd-8294-acfe9bf7bcee-kube-api-access-mrxf7\") pod \"nova-cell1-conductor-db-sync-qtjqg\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.191680 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qtjqg"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.252759 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 18:05:46 crc kubenswrapper[4892]: W0217 18:05:46.253142 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bbfb7ed_07bf_43e0_baf7_58baefccbba9.slice/crio-a74b4c02d0ae62c584c20ca1424a783205e7af96e1b591ea1ec72ff2736c763f WatchSource:0}: Error finding container a74b4c02d0ae62c584c20ca1424a783205e7af96e1b591ea1ec72ff2736c763f: Status 404 returned error can't find the container with id a74b4c02d0ae62c584c20ca1424a783205e7af96e1b591ea1ec72ff2736c763f
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.267971 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.285018 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-lzftn"]
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.684778 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qtjqg"]
Feb 17 18:05:46 crc kubenswrapper[4892]: W0217 18:05:46.688566 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod150beb6f_d09f_42cd_8294_acfe9bf7bcee.slice/crio-74b8d540dab8883240a904e110deb8ff5b7926dcdba07eef536b9335bc1a245b WatchSource:0}: Error finding container 74b8d540dab8883240a904e110deb8ff5b7926dcdba07eef536b9335bc1a245b: Status 404 returned error can't find the container with id 74b8d540dab8883240a904e110deb8ff5b7926dcdba07eef536b9335bc1a245b
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.798729 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzqfl" event={"ID":"60efd3a5-df5d-4f7c-949b-f9952e234b8d","Type":"ContainerStarted","Data":"6a15516a5b658d1698d3bd63059e7269ca1018b30edeef2d1a8743a7ac2d9849"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.805493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5fae9e7-9888-49a4-853c-7c8af9c39f98","Type":"ContainerStarted","Data":"a38ca570335e6844249f77f9c45903b344ac7a9a57d1f78dfe55672df8cbacca"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.807794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17df5ca6-aa43-460a-91a1-76a87616376e","Type":"ContainerStarted","Data":"7e7c22b8935b6d2bca5c438272316f1ec950c43de574e6687994675c82e50b95"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.819035 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lzqfl" podStartSLOduration=2.819016132 podStartE2EDuration="2.819016132s" podCreationTimestamp="2026-02-17 18:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:05:46.810208235 +0000 UTC m=+1318.185611500" watchObservedRunningTime="2026-02-17 18:05:46.819016132 +0000 UTC m=+1318.194419397"
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.819283 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerStarted","Data":"b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.819330 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerStarted","Data":"6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.820998 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" event={"ID":"150beb6f-d09f-42cd-8294-acfe9bf7bcee","Type":"ContainerStarted","Data":"74b8d540dab8883240a904e110deb8ff5b7926dcdba07eef536b9335bc1a245b"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.822362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55","Type":"ContainerStarted","Data":"526c75c3a3595efa562b5e45ed4f025fe3920aad00d7f8cb393c17ed11ff0fa0"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.824223 4892 generic.go:334] "Generic (PLEG): container finished" podID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerID="cfa0711cc5a62159977c3f380e6ebdb61760d793a1f27715a87929a7d74e57da" exitCode=0
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.824263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" event={"ID":"8bbfb7ed-07bf-43e0-baf7-58baefccbba9","Type":"ContainerDied","Data":"cfa0711cc5a62159977c3f380e6ebdb61760d793a1f27715a87929a7d74e57da"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.824280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" event={"ID":"8bbfb7ed-07bf-43e0-baf7-58baefccbba9","Type":"ContainerStarted","Data":"a74b4c02d0ae62c584c20ca1424a783205e7af96e1b591ea1ec72ff2736c763f"}
Feb 17 18:05:46 crc kubenswrapper[4892]: I0217 18:05:46.830310 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8922c341-2921-434e-94d3-007a3c54b5f0","Type":"ContainerStarted","Data":"c18b24cecd183d3071c80d7c54e28cdae01d9471babde029399fa5f3ef725331"}
Feb 17 18:05:47 crc kubenswrapper[4892]: I0217 18:05:47.021752 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 18:05:47 crc kubenswrapper[4892]: I0217 18:05:47.842445 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" event={"ID":"150beb6f-d09f-42cd-8294-acfe9bf7bcee","Type":"ContainerStarted","Data":"df8c62daa6f190fa1487fe86dda7f0982aee1c48c29f17d050a8676d2e3c93d6"}
Feb 17 18:05:47 crc kubenswrapper[4892]: I0217 18:05:47.845274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" event={"ID":"8bbfb7ed-07bf-43e0-baf7-58baefccbba9","Type":"ContainerStarted","Data":"2e7ce2ed233decd4d799f8db2acda887290e7fb0198c1925a7111823bcc856c3"}
Feb 17 18:05:47 crc kubenswrapper[4892]: I0217 18:05:47.845517 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-lzftn"
Feb 17 18:05:47 crc kubenswrapper[4892]: I0217 18:05:47.859703 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" podStartSLOduration=2.859670927 podStartE2EDuration="2.859670927s" podCreationTimestamp="2026-02-17 18:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:05:47.858180806 +0000 UTC m=+1319.233584071" watchObservedRunningTime="2026-02-17 18:05:47.859670927 +0000 UTC m=+1319.235074182"
Feb 17 18:05:47 crc kubenswrapper[4892]: I0217 18:05:47.887651 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" podStartSLOduration=3.887633941 podStartE2EDuration="3.887633941s" podCreationTimestamp="2026-02-17 18:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:05:47.884579668 +0000 UTC m=+1319.259982933" watchObservedRunningTime="2026-02-17 18:05:47.887633941 +0000 UTC m=+1319.263037206"
Feb 17 18:05:48 crc kubenswrapper[4892]: I0217 18:05:48.365691 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:05:48 crc kubenswrapper[4892]: I0217 18:05:48.405598 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.885554 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17df5ca6-aa43-460a-91a1-76a87616376e","Type":"ContainerStarted","Data":"774bd1818f399c46a92e0d40d59f996bb13f1cbbcb52122a6f55526ac6091b99"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.886146 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17df5ca6-aa43-460a-91a1-76a87616376e","Type":"ContainerStarted","Data":"1ff9989f4376fc857006e940d7640f139edcdfb827dfb96c134138f77b244736"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.885677 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-log" containerID="cri-o://1ff9989f4376fc857006e940d7640f139edcdfb827dfb96c134138f77b244736" gracePeriod=30 Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.885737 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-metadata" containerID="cri-o://774bd1818f399c46a92e0d40d59f996bb13f1cbbcb52122a6f55526ac6091b99" gracePeriod=30 Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.889142 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerStarted","Data":"de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.889288 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 18:05:50 crc 
kubenswrapper[4892]: I0217 18:05:50.892158 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55","Type":"ContainerStarted","Data":"efd8dfb1e02ad04c1f660122e0b638a7ce0432928a1c09ee9deb48c80b346a6e"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.892227 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://efd8dfb1e02ad04c1f660122e0b638a7ce0432928a1c09ee9deb48c80b346a6e" gracePeriod=30 Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.897721 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8922c341-2921-434e-94d3-007a3c54b5f0","Type":"ContainerStarted","Data":"a30e66b0f965fe369316bd9afdf3998e749bb17909a6a522c2f5daa476a76752"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.897768 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8922c341-2921-434e-94d3-007a3c54b5f0","Type":"ContainerStarted","Data":"c0fd417bd566f1c3e2720d64fd90ecfb638b16e493eabd76bf3bdea6c3f64227"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.904274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5fae9e7-9888-49a4-853c-7c8af9c39f98","Type":"ContainerStarted","Data":"565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2"} Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.913176 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.181490311 podStartE2EDuration="6.913153916s" podCreationTimestamp="2026-02-17 18:05:44 +0000 UTC" firstStartedPulling="2026-02-17 18:05:46.287027495 +0000 UTC m=+1317.662430760" lastFinishedPulling="2026-02-17 18:05:50.0186911 +0000 UTC 
m=+1321.394094365" observedRunningTime="2026-02-17 18:05:50.904851982 +0000 UTC m=+1322.280255257" watchObservedRunningTime="2026-02-17 18:05:50.913153916 +0000 UTC m=+1322.288557181" Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.945103 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.96220217 podStartE2EDuration="6.945081796s" podCreationTimestamp="2026-02-17 18:05:44 +0000 UTC" firstStartedPulling="2026-02-17 18:05:46.023355089 +0000 UTC m=+1317.398758354" lastFinishedPulling="2026-02-17 18:05:50.006234715 +0000 UTC m=+1321.381637980" observedRunningTime="2026-02-17 18:05:50.925829017 +0000 UTC m=+1322.301232302" watchObservedRunningTime="2026-02-17 18:05:50.945081796 +0000 UTC m=+1322.320485061" Feb 17 18:05:50 crc kubenswrapper[4892]: I0217 18:05:50.981180 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.243944664 podStartE2EDuration="6.981141988s" podCreationTimestamp="2026-02-17 18:05:44 +0000 UTC" firstStartedPulling="2026-02-17 18:05:46.303260673 +0000 UTC m=+1317.678663938" lastFinishedPulling="2026-02-17 18:05:50.040457977 +0000 UTC m=+1321.415861262" observedRunningTime="2026-02-17 18:05:50.951276313 +0000 UTC m=+1322.326679578" watchObservedRunningTime="2026-02-17 18:05:50.981141988 +0000 UTC m=+1322.356545253" Feb 17 18:05:51 crc kubenswrapper[4892]: I0217 18:05:51.006207 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.842591938 podStartE2EDuration="7.006190413s" podCreationTimestamp="2026-02-17 18:05:44 +0000 UTC" firstStartedPulling="2026-02-17 18:05:45.853435411 +0000 UTC m=+1317.228838676" lastFinishedPulling="2026-02-17 18:05:50.017033896 +0000 UTC m=+1321.392437151" observedRunningTime="2026-02-17 18:05:50.971052976 +0000 UTC m=+1322.346456251" watchObservedRunningTime="2026-02-17 18:05:51.006190413 
+0000 UTC m=+1322.381593678" Feb 17 18:05:51 crc kubenswrapper[4892]: I0217 18:05:51.044180 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.942015739 podStartE2EDuration="9.044156656s" podCreationTimestamp="2026-02-17 18:05:42 +0000 UTC" firstStartedPulling="2026-02-17 18:05:42.957894288 +0000 UTC m=+1314.333297553" lastFinishedPulling="2026-02-17 18:05:50.060035195 +0000 UTC m=+1321.435438470" observedRunningTime="2026-02-17 18:05:51.00531684 +0000 UTC m=+1322.380720115" watchObservedRunningTime="2026-02-17 18:05:51.044156656 +0000 UTC m=+1322.419559921" Feb 17 18:05:51 crc kubenswrapper[4892]: I0217 18:05:51.918625 4892 generic.go:334] "Generic (PLEG): container finished" podID="17df5ca6-aa43-460a-91a1-76a87616376e" containerID="1ff9989f4376fc857006e940d7640f139edcdfb827dfb96c134138f77b244736" exitCode=143 Feb 17 18:05:51 crc kubenswrapper[4892]: I0217 18:05:51.918733 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17df5ca6-aa43-460a-91a1-76a87616376e","Type":"ContainerDied","Data":"1ff9989f4376fc857006e940d7640f139edcdfb827dfb96c134138f77b244736"} Feb 17 18:05:53 crc kubenswrapper[4892]: I0217 18:05:53.939704 4892 generic.go:334] "Generic (PLEG): container finished" podID="60efd3a5-df5d-4f7c-949b-f9952e234b8d" containerID="6a15516a5b658d1698d3bd63059e7269ca1018b30edeef2d1a8743a7ac2d9849" exitCode=0 Feb 17 18:05:53 crc kubenswrapper[4892]: I0217 18:05:53.939800 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzqfl" event={"ID":"60efd3a5-df5d-4f7c-949b-f9952e234b8d","Type":"ContainerDied","Data":"6a15516a5b658d1698d3bd63059e7269ca1018b30edeef2d1a8743a7ac2d9849"} Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.179025 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.179360 
4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.231159 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.231486 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.231498 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.338855 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.344720 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.344788 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.386464 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.386500 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.401540 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-combined-ca-bundle\") pod \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.401616 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-config-data\") pod \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.401877 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-scripts\") pod \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.401955 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ptrp\" (UniqueName: \"kubernetes.io/projected/60efd3a5-df5d-4f7c-949b-f9952e234b8d-kube-api-access-2ptrp\") pod \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\" (UID: \"60efd3a5-df5d-4f7c-949b-f9952e234b8d\") " Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.422955 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-scripts" (OuterVolumeSpecName: "scripts") pod "60efd3a5-df5d-4f7c-949b-f9952e234b8d" (UID: "60efd3a5-df5d-4f7c-949b-f9952e234b8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.442168 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-config-data" (OuterVolumeSpecName: "config-data") pod "60efd3a5-df5d-4f7c-949b-f9952e234b8d" (UID: "60efd3a5-df5d-4f7c-949b-f9952e234b8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.460198 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60efd3a5-df5d-4f7c-949b-f9952e234b8d-kube-api-access-2ptrp" (OuterVolumeSpecName: "kube-api-access-2ptrp") pod "60efd3a5-df5d-4f7c-949b-f9952e234b8d" (UID: "60efd3a5-df5d-4f7c-949b-f9952e234b8d"). InnerVolumeSpecName "kube-api-access-2ptrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.479724 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60efd3a5-df5d-4f7c-949b-f9952e234b8d" (UID: "60efd3a5-df5d-4f7c-949b-f9952e234b8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.491824 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-9mg8g"] Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.492096 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerName="dnsmasq-dns" containerID="cri-o://cd9ecb1657eaf84acbc9ad761ad42a0e4fed1fa8c98f3b1aa627599fe7925dc0" gracePeriod=10 Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.514176 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.514214 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-config-data\") on node 
\"crc\" DevicePath \"\"" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.514225 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60efd3a5-df5d-4f7c-949b-f9952e234b8d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.514237 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ptrp\" (UniqueName: \"kubernetes.io/projected/60efd3a5-df5d-4f7c-949b-f9952e234b8d-kube-api-access-2ptrp\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.968930 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzqfl" event={"ID":"60efd3a5-df5d-4f7c-949b-f9952e234b8d","Type":"ContainerDied","Data":"ef5e47cd24c01c3d4ae62af5e10b8cfacdd3cc1624d46fcd4a5c9162388dc4af"} Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.969497 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef5e47cd24c01c3d4ae62af5e10b8cfacdd3cc1624d46fcd4a5c9162388dc4af" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.969653 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzqfl" Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.983740 4892 generic.go:334] "Generic (PLEG): container finished" podID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerID="cd9ecb1657eaf84acbc9ad761ad42a0e4fed1fa8c98f3b1aa627599fe7925dc0" exitCode=0 Feb 17 18:05:55 crc kubenswrapper[4892]: I0217 18:05:55.986976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" event={"ID":"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac","Type":"ContainerDied","Data":"cd9ecb1657eaf84acbc9ad761ad42a0e4fed1fa8c98f3b1aa627599fe7925dc0"} Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.025397 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.066678 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.125909 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-sb\") pod \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.125954 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-nb\") pod \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.126054 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-config\") pod \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\" 
(UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.126142 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-svc\") pod \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.126263 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-swift-storage-0\") pod \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.126318 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl5vh\" (UniqueName: \"kubernetes.io/projected/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-kube-api-access-nl5vh\") pod \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\" (UID: \"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac\") " Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.131379 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-kube-api-access-nl5vh" (OuterVolumeSpecName: "kube-api-access-nl5vh") pod "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" (UID: "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac"). InnerVolumeSpecName "kube-api-access-nl5vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.181457 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" (UID: "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.189551 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-config" (OuterVolumeSpecName: "config") pod "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" (UID: "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.194280 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" (UID: "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.201113 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" (UID: "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.211144 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" (UID: "522cb1f9-2e67-46ca-b4e1-3c837d27d9ac"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.229424 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.229456 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.229465 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.229475 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.229484 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.229494 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl5vh\" (UniqueName: \"kubernetes.io/projected/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac-kube-api-access-nl5vh\") on node \"crc\" DevicePath \"\"" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.240799 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.241021 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-log" containerID="cri-o://c0fd417bd566f1c3e2720d64fd90ecfb638b16e493eabd76bf3bdea6c3f64227" gracePeriod=30 Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.241143 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-api" containerID="cri-o://a30e66b0f965fe369316bd9afdf3998e749bb17909a6a522c2f5daa476a76752" gracePeriod=30 Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.247353 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.247353 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.573432 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.995177 4892 generic.go:334] "Generic (PLEG): container finished" podID="8922c341-2921-434e-94d3-007a3c54b5f0" containerID="c0fd417bd566f1c3e2720d64fd90ecfb638b16e493eabd76bf3bdea6c3f64227" exitCode=143 Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.995254 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8922c341-2921-434e-94d3-007a3c54b5f0","Type":"ContainerDied","Data":"c0fd417bd566f1c3e2720d64fd90ecfb638b16e493eabd76bf3bdea6c3f64227"} Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 
18:05:56.996967 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" event={"ID":"522cb1f9-2e67-46ca-b4e1-3c837d27d9ac","Type":"ContainerDied","Data":"fc68767794ac670f2f8a3577021ff4aa4f814cd12103b0261138ed3bb016b90c"} Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.997006 4892 scope.go:117] "RemoveContainer" containerID="cd9ecb1657eaf84acbc9ad761ad42a0e4fed1fa8c98f3b1aa627599fe7925dc0" Feb 17 18:05:56 crc kubenswrapper[4892]: I0217 18:05:56.997034 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-9mg8g" Feb 17 18:05:57 crc kubenswrapper[4892]: I0217 18:05:57.029017 4892 scope.go:117] "RemoveContainer" containerID="5114d3b45235512ff14c5852bae41a22117edf23a6c4bc5ea936e4ced47a583f" Feb 17 18:05:57 crc kubenswrapper[4892]: I0217 18:05:57.034076 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-9mg8g"] Feb 17 18:05:57 crc kubenswrapper[4892]: I0217 18:05:57.051173 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-9mg8g"] Feb 17 18:05:57 crc kubenswrapper[4892]: I0217 18:05:57.373789 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" path="/var/lib/kubelet/pods/522cb1f9-2e67-46ca-b4e1-3c837d27d9ac/volumes" Feb 17 18:05:58 crc kubenswrapper[4892]: I0217 18:05:58.007375 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" containerName="nova-scheduler-scheduler" containerID="cri-o://565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" gracePeriod=30 Feb 17 18:05:59 crc kubenswrapper[4892]: I0217 18:05:59.024779 4892 generic.go:334] "Generic (PLEG): container finished" podID="150beb6f-d09f-42cd-8294-acfe9bf7bcee" containerID="df8c62daa6f190fa1487fe86dda7f0982aee1c48c29f17d050a8676d2e3c93d6" 
exitCode=0 Feb 17 18:05:59 crc kubenswrapper[4892]: I0217 18:05:59.024846 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" event={"ID":"150beb6f-d09f-42cd-8294-acfe9bf7bcee","Type":"ContainerDied","Data":"df8c62daa6f190fa1487fe86dda7f0982aee1c48c29f17d050a8676d2e3c93d6"} Feb 17 18:06:00 crc kubenswrapper[4892]: E0217 18:06:00.181148 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 18:06:00 crc kubenswrapper[4892]: E0217 18:06:00.183880 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 18:06:00 crc kubenswrapper[4892]: E0217 18:06:00.189267 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 18:06:00 crc kubenswrapper[4892]: E0217 18:06:00.189319 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" containerName="nova-scheduler-scheduler" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.424836 4892 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.522229 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrxf7\" (UniqueName: \"kubernetes.io/projected/150beb6f-d09f-42cd-8294-acfe9bf7bcee-kube-api-access-mrxf7\") pod \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.522302 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-scripts\") pod \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.522400 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-combined-ca-bundle\") pod \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.522478 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-config-data\") pod \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\" (UID: \"150beb6f-d09f-42cd-8294-acfe9bf7bcee\") " Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.533505 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150beb6f-d09f-42cd-8294-acfe9bf7bcee-kube-api-access-mrxf7" (OuterVolumeSpecName: "kube-api-access-mrxf7") pod "150beb6f-d09f-42cd-8294-acfe9bf7bcee" (UID: "150beb6f-d09f-42cd-8294-acfe9bf7bcee"). InnerVolumeSpecName "kube-api-access-mrxf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.534634 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-scripts" (OuterVolumeSpecName: "scripts") pod "150beb6f-d09f-42cd-8294-acfe9bf7bcee" (UID: "150beb6f-d09f-42cd-8294-acfe9bf7bcee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.557708 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "150beb6f-d09f-42cd-8294-acfe9bf7bcee" (UID: "150beb6f-d09f-42cd-8294-acfe9bf7bcee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.560439 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-config-data" (OuterVolumeSpecName: "config-data") pod "150beb6f-d09f-42cd-8294-acfe9bf7bcee" (UID: "150beb6f-d09f-42cd-8294-acfe9bf7bcee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.625017 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrxf7\" (UniqueName: \"kubernetes.io/projected/150beb6f-d09f-42cd-8294-acfe9bf7bcee-kube-api-access-mrxf7\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.625321 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.625333 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:00 crc kubenswrapper[4892]: I0217 18:06:00.625343 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150beb6f-d09f-42cd-8294-acfe9bf7bcee-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.057625 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" event={"ID":"150beb6f-d09f-42cd-8294-acfe9bf7bcee","Type":"ContainerDied","Data":"74b8d540dab8883240a904e110deb8ff5b7926dcdba07eef536b9335bc1a245b"} Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.057683 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b8d540dab8883240a904e110deb8ff5b7926dcdba07eef536b9335bc1a245b" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.057756 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qtjqg" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.173302 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 18:06:01 crc kubenswrapper[4892]: E0217 18:06:01.174039 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150beb6f-d09f-42cd-8294-acfe9bf7bcee" containerName="nova-cell1-conductor-db-sync" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174070 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="150beb6f-d09f-42cd-8294-acfe9bf7bcee" containerName="nova-cell1-conductor-db-sync" Feb 17 18:06:01 crc kubenswrapper[4892]: E0217 18:06:01.174110 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerName="init" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174124 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerName="init" Feb 17 18:06:01 crc kubenswrapper[4892]: E0217 18:06:01.174298 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60efd3a5-df5d-4f7c-949b-f9952e234b8d" containerName="nova-manage" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174309 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="60efd3a5-df5d-4f7c-949b-f9952e234b8d" containerName="nova-manage" Feb 17 18:06:01 crc kubenswrapper[4892]: E0217 18:06:01.174326 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerName="dnsmasq-dns" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174336 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerName="dnsmasq-dns" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174735 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="60efd3a5-df5d-4f7c-949b-f9952e234b8d" 
containerName="nova-manage" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174786 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="522cb1f9-2e67-46ca-b4e1-3c837d27d9ac" containerName="dnsmasq-dns" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.174804 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="150beb6f-d09f-42cd-8294-acfe9bf7bcee" containerName="nova-cell1-conductor-db-sync" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.175907 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.179300 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.192354 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.239414 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.239474 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.239596 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt2t\" (UniqueName: 
\"kubernetes.io/projected/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-kube-api-access-kvt2t\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.341774 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvt2t\" (UniqueName: \"kubernetes.io/projected/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-kube-api-access-kvt2t\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.342008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.342043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.346310 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.346700 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.374192 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvt2t\" (UniqueName: \"kubernetes.io/projected/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-kube-api-access-kvt2t\") pod \"nova-cell1-conductor-0\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:01 crc kubenswrapper[4892]: I0217 18:06:01.504153 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.026066 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 18:06:02 crc kubenswrapper[4892]: W0217 18:06:02.039597 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4d79298_5d7f_4bd2_a8ba_b76c1a8b44f4.slice/crio-759e1b46d3d18c535b39242cd08c91124e27c8c3b47d2ef2c3ef3853cc58d85b WatchSource:0}: Error finding container 759e1b46d3d18c535b39242cd08c91124e27c8c3b47d2ef2c3ef3853cc58d85b: Status 404 returned error can't find the container with id 759e1b46d3d18c535b39242cd08c91124e27c8c3b47d2ef2c3ef3853cc58d85b Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.073897 4892 generic.go:334] "Generic (PLEG): container finished" podID="8922c341-2921-434e-94d3-007a3c54b5f0" containerID="a30e66b0f965fe369316bd9afdf3998e749bb17909a6a522c2f5daa476a76752" exitCode=0 Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.074066 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8922c341-2921-434e-94d3-007a3c54b5f0","Type":"ContainerDied","Data":"a30e66b0f965fe369316bd9afdf3998e749bb17909a6a522c2f5daa476a76752"} Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.075624 4892 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4","Type":"ContainerStarted","Data":"759e1b46d3d18c535b39242cd08c91124e27c8c3b47d2ef2c3ef3853cc58d85b"} Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.167161 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.262249 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-combined-ca-bundle\") pod \"8922c341-2921-434e-94d3-007a3c54b5f0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.262312 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8922c341-2921-434e-94d3-007a3c54b5f0-logs\") pod \"8922c341-2921-434e-94d3-007a3c54b5f0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.262426 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsxcv\" (UniqueName: \"kubernetes.io/projected/8922c341-2921-434e-94d3-007a3c54b5f0-kube-api-access-rsxcv\") pod \"8922c341-2921-434e-94d3-007a3c54b5f0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.262567 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-config-data\") pod \"8922c341-2921-434e-94d3-007a3c54b5f0\" (UID: \"8922c341-2921-434e-94d3-007a3c54b5f0\") " Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.263775 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8922c341-2921-434e-94d3-007a3c54b5f0-logs" 
(OuterVolumeSpecName: "logs") pod "8922c341-2921-434e-94d3-007a3c54b5f0" (UID: "8922c341-2921-434e-94d3-007a3c54b5f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.268538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8922c341-2921-434e-94d3-007a3c54b5f0-kube-api-access-rsxcv" (OuterVolumeSpecName: "kube-api-access-rsxcv") pod "8922c341-2921-434e-94d3-007a3c54b5f0" (UID: "8922c341-2921-434e-94d3-007a3c54b5f0"). InnerVolumeSpecName "kube-api-access-rsxcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.295934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8922c341-2921-434e-94d3-007a3c54b5f0" (UID: "8922c341-2921-434e-94d3-007a3c54b5f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.296720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-config-data" (OuterVolumeSpecName: "config-data") pod "8922c341-2921-434e-94d3-007a3c54b5f0" (UID: "8922c341-2921-434e-94d3-007a3c54b5f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.366725 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsxcv\" (UniqueName: \"kubernetes.io/projected/8922c341-2921-434e-94d3-007a3c54b5f0-kube-api-access-rsxcv\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.366803 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.366868 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922c341-2921-434e-94d3-007a3c54b5f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:02 crc kubenswrapper[4892]: I0217 18:06:02.366894 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8922c341-2921-434e-94d3-007a3c54b5f0-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.085922 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4","Type":"ContainerStarted","Data":"59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f"} Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.086458 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.088311 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8922c341-2921-434e-94d3-007a3c54b5f0","Type":"ContainerDied","Data":"c18b24cecd183d3071c80d7c54e28cdae01d9471babde029399fa5f3ef725331"} Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.088403 4892 scope.go:117] "RemoveContainer" 
containerID="a30e66b0f965fe369316bd9afdf3998e749bb17909a6a522c2f5daa476a76752" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.088416 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.121858 4892 scope.go:117] "RemoveContainer" containerID="c0fd417bd566f1c3e2720d64fd90ecfb638b16e493eabd76bf3bdea6c3f64227" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.496870 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.496850627 podStartE2EDuration="2.496850627s" podCreationTimestamp="2026-02-17 18:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:03.492231192 +0000 UTC m=+1334.867634457" watchObservedRunningTime="2026-02-17 18:06:03.496850627 +0000 UTC m=+1334.872253892" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.519668 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.536618 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.549933 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:03 crc kubenswrapper[4892]: E0217 18:06:03.550507 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-log" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.550531 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-log" Feb 17 18:06:03 crc kubenswrapper[4892]: E0217 18:06:03.550573 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" 
containerName="nova-api-api" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.550582 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-api" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.550887 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-api" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.550921 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" containerName="nova-api-log" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.552293 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.557732 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.560020 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.613083 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-logs\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.613340 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-config-data\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.613616 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptpb\" 
(UniqueName: \"kubernetes.io/projected/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-kube-api-access-6ptpb\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.613716 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.715590 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-logs\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.715672 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-config-data\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.715738 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptpb\" (UniqueName: \"kubernetes.io/projected/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-kube-api-access-6ptpb\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.715776 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc 
kubenswrapper[4892]: I0217 18:06:03.716519 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-logs\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.723351 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-config-data\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.724361 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.737956 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptpb\" (UniqueName: \"kubernetes.io/projected/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-kube-api-access-6ptpb\") pod \"nova-api-0\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " pod="openstack/nova-api-0" Feb 17 18:06:03 crc kubenswrapper[4892]: I0217 18:06:03.874057 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.089149 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.116881 4892 generic.go:334] "Generic (PLEG): container finished" podID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" exitCode=0 Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.117014 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.117890 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5fae9e7-9888-49a4-853c-7c8af9c39f98","Type":"ContainerDied","Data":"565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2"} Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.117930 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5fae9e7-9888-49a4-853c-7c8af9c39f98","Type":"ContainerDied","Data":"a38ca570335e6844249f77f9c45903b344ac7a9a57d1f78dfe55672df8cbacca"} Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.117952 4892 scope.go:117] "RemoveContainer" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.164363 4892 scope.go:117] "RemoveContainer" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" Feb 17 18:06:04 crc kubenswrapper[4892]: E0217 18:06:04.166158 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2\": container with ID starting with 565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2 not found: ID does not exist" containerID="565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.166228 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2"} err="failed to get container status \"565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2\": rpc error: code = NotFound desc = could not find container \"565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2\": container with ID starting with 565a05ec270c8a8faa38bcf98b0ae0a4d4ed11b0be9d29859ea0b38605acbdb2 not found: ID does not exist" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.229854 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-config-data\") pod \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.230013 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-combined-ca-bundle\") pod \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.230033 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5r6\" (UniqueName: \"kubernetes.io/projected/b5fae9e7-9888-49a4-853c-7c8af9c39f98-kube-api-access-bt5r6\") pod \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\" (UID: \"b5fae9e7-9888-49a4-853c-7c8af9c39f98\") " Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.238090 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fae9e7-9888-49a4-853c-7c8af9c39f98-kube-api-access-bt5r6" (OuterVolumeSpecName: "kube-api-access-bt5r6") pod "b5fae9e7-9888-49a4-853c-7c8af9c39f98" (UID: "b5fae9e7-9888-49a4-853c-7c8af9c39f98"). 
InnerVolumeSpecName "kube-api-access-bt5r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.263872 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-config-data" (OuterVolumeSpecName: "config-data") pod "b5fae9e7-9888-49a4-853c-7c8af9c39f98" (UID: "b5fae9e7-9888-49a4-853c-7c8af9c39f98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.269382 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5fae9e7-9888-49a4-853c-7c8af9c39f98" (UID: "b5fae9e7-9888-49a4-853c-7c8af9c39f98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.332388 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.332678 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt5r6\" (UniqueName: \"kubernetes.io/projected/b5fae9e7-9888-49a4-853c-7c8af9c39f98-kube-api-access-bt5r6\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.332687 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fae9e7-9888-49a4-853c-7c8af9c39f98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.383894 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.453732 
4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.481747 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.492367 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:04 crc kubenswrapper[4892]: E0217 18:06:04.492914 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" containerName="nova-scheduler-scheduler" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.492935 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" containerName="nova-scheduler-scheduler" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.493163 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" containerName="nova-scheduler-scheduler" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.493969 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.497118 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.502371 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.537142 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tfw5\" (UniqueName: \"kubernetes.io/projected/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-kube-api-access-6tfw5\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.537252 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-config-data\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.537351 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.641018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfw5\" (UniqueName: \"kubernetes.io/projected/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-kube-api-access-6tfw5\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.641177 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-config-data\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.642316 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.646650 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.650038 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-config-data\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.676436 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfw5\" (UniqueName: \"kubernetes.io/projected/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-kube-api-access-6tfw5\") pod \"nova-scheduler-0\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:04 crc kubenswrapper[4892]: I0217 18:06:04.828645 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.133166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5f00cf6-62fb-43dc-83b0-d07d1013ac39","Type":"ContainerStarted","Data":"9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad"} Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.133418 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5f00cf6-62fb-43dc-83b0-d07d1013ac39","Type":"ContainerStarted","Data":"a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa"} Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.133429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5f00cf6-62fb-43dc-83b0-d07d1013ac39","Type":"ContainerStarted","Data":"1d29297b6ed97cf2eb38e716e4cbd6509cb9ba206e21c95ad0bcf319a7f6a30e"} Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.152455 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.152435334 podStartE2EDuration="2.152435334s" podCreationTimestamp="2026-02-17 18:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:05.149768202 +0000 UTC m=+1336.525171467" watchObservedRunningTime="2026-02-17 18:06:05.152435334 +0000 UTC m=+1336.527838609" Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.308605 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:05 crc kubenswrapper[4892]: W0217 18:06:05.312695 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ba050d_ff82_4828_91b0_c2ca3eeccd5e.slice/crio-def61e4a980cfcd3994302edbbbfa4374f051a5b2679f3a8e22f2eb506ca33e7 WatchSource:0}: Error finding 
container def61e4a980cfcd3994302edbbbfa4374f051a5b2679f3a8e22f2eb506ca33e7: Status 404 returned error can't find the container with id def61e4a980cfcd3994302edbbbfa4374f051a5b2679f3a8e22f2eb506ca33e7 Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.371742 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8922c341-2921-434e-94d3-007a3c54b5f0" path="/var/lib/kubelet/pods/8922c341-2921-434e-94d3-007a3c54b5f0/volumes" Feb 17 18:06:05 crc kubenswrapper[4892]: I0217 18:06:05.372675 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fae9e7-9888-49a4-853c-7c8af9c39f98" path="/var/lib/kubelet/pods/b5fae9e7-9888-49a4-853c-7c8af9c39f98/volumes" Feb 17 18:06:06 crc kubenswrapper[4892]: I0217 18:06:06.146098 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e","Type":"ContainerStarted","Data":"f7336952b591bb5a3beb50a3e49535fdba933d66072b6bbdd4a002161abb2ff1"} Feb 17 18:06:06 crc kubenswrapper[4892]: I0217 18:06:06.148138 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e","Type":"ContainerStarted","Data":"def61e4a980cfcd3994302edbbbfa4374f051a5b2679f3a8e22f2eb506ca33e7"} Feb 17 18:06:06 crc kubenswrapper[4892]: I0217 18:06:06.181538 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.181513567 podStartE2EDuration="2.181513567s" podCreationTimestamp="2026-02-17 18:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:06.172947966 +0000 UTC m=+1337.548351271" watchObservedRunningTime="2026-02-17 18:06:06.181513567 +0000 UTC m=+1337.556916862" Feb 17 18:06:07 crc kubenswrapper[4892]: I0217 18:06:07.424675 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:06:07 crc kubenswrapper[4892]: I0217 18:06:07.425160 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:06:09 crc kubenswrapper[4892]: I0217 18:06:09.829439 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 18:06:11 crc kubenswrapper[4892]: I0217 18:06:11.552538 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 18:06:12 crc kubenswrapper[4892]: I0217 18:06:12.480503 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 18:06:13 crc kubenswrapper[4892]: I0217 18:06:13.874804 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 18:06:13 crc kubenswrapper[4892]: I0217 18:06:13.875114 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 18:06:14 crc kubenswrapper[4892]: I0217 18:06:14.829946 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 18:06:14 crc kubenswrapper[4892]: I0217 18:06:14.862315 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 18:06:14 crc kubenswrapper[4892]: I0217 18:06:14.956957 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:14 crc kubenswrapper[4892]: I0217 18:06:14.957003 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:15 crc kubenswrapper[4892]: I0217 18:06:15.315067 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.376835 4892 generic.go:334] "Generic (PLEG): container finished" podID="17df5ca6-aa43-460a-91a1-76a87616376e" containerID="774bd1818f399c46a92e0d40d59f996bb13f1cbbcb52122a6f55526ac6091b99" exitCode=137 Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.377295 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17df5ca6-aa43-460a-91a1-76a87616376e","Type":"ContainerDied","Data":"774bd1818f399c46a92e0d40d59f996bb13f1cbbcb52122a6f55526ac6091b99"} Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.377315 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17df5ca6-aa43-460a-91a1-76a87616376e","Type":"ContainerDied","Data":"7e7c22b8935b6d2bca5c438272316f1ec950c43de574e6687994675c82e50b95"} Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.377325 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7c22b8935b6d2bca5c438272316f1ec950c43de574e6687994675c82e50b95" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.379124 4892 generic.go:334] "Generic (PLEG): container finished" podID="59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" 
containerID="efd8dfb1e02ad04c1f660122e0b638a7ce0432928a1c09ee9deb48c80b346a6e" exitCode=137 Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.379150 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55","Type":"ContainerDied","Data":"efd8dfb1e02ad04c1f660122e0b638a7ce0432928a1c09ee9deb48c80b346a6e"} Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.436902 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.441437 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525199 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-combined-ca-bundle\") pod \"17df5ca6-aa43-460a-91a1-76a87616376e\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525574 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjhwb\" (UniqueName: \"kubernetes.io/projected/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-kube-api-access-fjhwb\") pod \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525613 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-config-data\") pod \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525632 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/17df5ca6-aa43-460a-91a1-76a87616376e-logs\") pod \"17df5ca6-aa43-460a-91a1-76a87616376e\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-config-data\") pod \"17df5ca6-aa43-460a-91a1-76a87616376e\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525802 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pjbm\" (UniqueName: \"kubernetes.io/projected/17df5ca6-aa43-460a-91a1-76a87616376e-kube-api-access-5pjbm\") pod \"17df5ca6-aa43-460a-91a1-76a87616376e\" (UID: \"17df5ca6-aa43-460a-91a1-76a87616376e\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.525843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-combined-ca-bundle\") pod \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\" (UID: \"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55\") " Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.526112 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17df5ca6-aa43-460a-91a1-76a87616376e-logs" (OuterVolumeSpecName: "logs") pod "17df5ca6-aa43-460a-91a1-76a87616376e" (UID: "17df5ca6-aa43-460a-91a1-76a87616376e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.526770 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17df5ca6-aa43-460a-91a1-76a87616376e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.531239 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17df5ca6-aa43-460a-91a1-76a87616376e-kube-api-access-5pjbm" (OuterVolumeSpecName: "kube-api-access-5pjbm") pod "17df5ca6-aa43-460a-91a1-76a87616376e" (UID: "17df5ca6-aa43-460a-91a1-76a87616376e"). InnerVolumeSpecName "kube-api-access-5pjbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.531741 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-kube-api-access-fjhwb" (OuterVolumeSpecName: "kube-api-access-fjhwb") pod "59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" (UID: "59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55"). InnerVolumeSpecName "kube-api-access-fjhwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.565515 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" (UID: "59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.568437 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-config-data" (OuterVolumeSpecName: "config-data") pod "59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" (UID: "59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.569095 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-config-data" (OuterVolumeSpecName: "config-data") pod "17df5ca6-aa43-460a-91a1-76a87616376e" (UID: "17df5ca6-aa43-460a-91a1-76a87616376e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.573118 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17df5ca6-aa43-460a-91a1-76a87616376e" (UID: "17df5ca6-aa43-460a-91a1-76a87616376e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.628184 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pjbm\" (UniqueName: \"kubernetes.io/projected/17df5ca6-aa43-460a-91a1-76a87616376e-kube-api-access-5pjbm\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.628371 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.628426 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.628496 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjhwb\" (UniqueName: \"kubernetes.io/projected/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-kube-api-access-fjhwb\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.628548 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:21 crc kubenswrapper[4892]: I0217 18:06:21.628609 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17df5ca6-aa43-460a-91a1-76a87616376e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.393749 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.393892 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.394003 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55","Type":"ContainerDied","Data":"526c75c3a3595efa562b5e45ed4f025fe3920aad00d7f8cb393c17ed11ff0fa0"} Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.395069 4892 scope.go:117] "RemoveContainer" containerID="efd8dfb1e02ad04c1f660122e0b638a7ce0432928a1c09ee9deb48c80b346a6e" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.468690 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.483184 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.524294 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.535451 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.546431 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: E0217 18:06:22.546967 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-log" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.546980 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-log" Feb 17 18:06:22 crc kubenswrapper[4892]: E0217 18:06:22.547005 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 
18:06:22.547013 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 18:06:22 crc kubenswrapper[4892]: E0217 18:06:22.547050 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-metadata" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.547057 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-metadata" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.547259 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-log" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.547278 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.547301 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" containerName="nova-metadata-metadata" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.547987 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.551302 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.551536 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.551756 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.556608 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.571643 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.573744 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.577023 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.577410 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.584025 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647257 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548mr\" (UniqueName: \"kubernetes.io/projected/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-kube-api-access-548mr\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647438 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qvw\" (UniqueName: \"kubernetes.io/projected/7b9c463d-2912-48ed-b342-74e1a2f92d19-kube-api-access-x2qvw\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647507 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647627 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7b9c463d-2912-48ed-b342-74e1a2f92d19-logs\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647751 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647887 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.647986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.648060 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-config-data\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.648188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.648299 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750239 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750286 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750312 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-config-data\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750348 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-config-data\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750383 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548mr\" (UniqueName: \"kubernetes.io/projected/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-kube-api-access-548mr\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750482 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qvw\" (UniqueName: \"kubernetes.io/projected/7b9c463d-2912-48ed-b342-74e1a2f92d19-kube-api-access-x2qvw\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750509 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.750527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9c463d-2912-48ed-b342-74e1a2f92d19-logs\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc 
kubenswrapper[4892]: I0217 18:06:22.750556 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.751732 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9c463d-2912-48ed-b342-74e1a2f92d19-logs\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.756980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.757096 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.757205 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-config-data\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.757323 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.757579 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.757867 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.764615 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.767509 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qvw\" (UniqueName: \"kubernetes.io/projected/7b9c463d-2912-48ed-b342-74e1a2f92d19-kube-api-access-x2qvw\") pod \"nova-metadata-0\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " pod="openstack/nova-metadata-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.770454 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548mr\" (UniqueName: \"kubernetes.io/projected/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-kube-api-access-548mr\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.866769 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:22 crc kubenswrapper[4892]: I0217 18:06:22.892401 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.382169 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17df5ca6-aa43-460a-91a1-76a87616376e" path="/var/lib/kubelet/pods/17df5ca6-aa43-460a-91a1-76a87616376e/volumes" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.384418 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55" path="/var/lib/kubelet/pods/59bcdc5c-eaeb-4ba3-b714-bcbbe30baa55/volumes" Feb 17 18:06:23 crc kubenswrapper[4892]: W0217 18:06:23.427144 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb02be00_7b9c_4e4f_bee4_65a6fe46c0c9.slice/crio-bc08a0edfe171310cda0903456032b06e034d627c97015128f7fd04f37fd7bd4 WatchSource:0}: Error finding container bc08a0edfe171310cda0903456032b06e034d627c97015128f7fd04f37fd7bd4: Status 404 returned error can't find the container with id bc08a0edfe171310cda0903456032b06e034d627c97015128f7fd04f37fd7bd4 Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.429115 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.447278 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.882708 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 
18:06:23.883063 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.883392 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.883457 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.885711 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 18:06:23 crc kubenswrapper[4892]: I0217 18:06:23.885773 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.073424 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-tdc8n"] Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.077653 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.081213 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-config\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.081402 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.081530 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.081621 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.081734 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" 
(UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.081808 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bldm\" (UniqueName: \"kubernetes.io/projected/f9e84558-a02a-4297-87f2-6b69a3b5f452-kube-api-access-8bldm\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.088031 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-tdc8n"] Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.184134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bldm\" (UniqueName: \"kubernetes.io/projected/f9e84558-a02a-4297-87f2-6b69a3b5f452-kube-api-access-8bldm\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.184445 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-config\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.184522 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.184638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.184724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.184790 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.185357 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-config\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.185416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.185721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.185936 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.186114 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.206119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bldm\" (UniqueName: \"kubernetes.io/projected/f9e84558-a02a-4297-87f2-6b69a3b5f452-kube-api-access-8bldm\") pod \"dnsmasq-dns-5c7b6c5df9-tdc8n\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.404035 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.427928 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b9c463d-2912-48ed-b342-74e1a2f92d19","Type":"ContainerStarted","Data":"6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803"} Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.428632 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b9c463d-2912-48ed-b342-74e1a2f92d19","Type":"ContainerStarted","Data":"00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7"} Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.428694 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b9c463d-2912-48ed-b342-74e1a2f92d19","Type":"ContainerStarted","Data":"79484b580ee2d7bbbcb661dab756e77779250582197a3e30c09c2a2868dd3caf"} Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.434853 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9","Type":"ContainerStarted","Data":"810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e"} Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.434950 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9","Type":"ContainerStarted","Data":"bc08a0edfe171310cda0903456032b06e034d627c97015128f7fd04f37fd7bd4"} Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.462150 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.462133096 podStartE2EDuration="2.462133096s" podCreationTimestamp="2026-02-17 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 18:06:24.45373575 +0000 UTC m=+1355.829139035" watchObservedRunningTime="2026-02-17 18:06:24.462133096 +0000 UTC m=+1355.837536361" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.478475 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.478457625 podStartE2EDuration="2.478457625s" podCreationTimestamp="2026-02-17 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:24.475408973 +0000 UTC m=+1355.850812238" watchObservedRunningTime="2026-02-17 18:06:24.478457625 +0000 UTC m=+1355.853860900" Feb 17 18:06:24 crc kubenswrapper[4892]: I0217 18:06:24.915065 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-tdc8n"] Feb 17 18:06:24 crc kubenswrapper[4892]: W0217 18:06:24.925020 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e84558_a02a_4297_87f2_6b69a3b5f452.slice/crio-ffdda80b687731c267061231019e007ba261f3e3d162c1927ed439b62939e3ec WatchSource:0}: Error finding container ffdda80b687731c267061231019e007ba261f3e3d162c1927ed439b62939e3ec: Status 404 returned error can't find the container with id ffdda80b687731c267061231019e007ba261f3e3d162c1927ed439b62939e3ec Feb 17 18:06:25 crc kubenswrapper[4892]: I0217 18:06:25.446027 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerID="f481d1c9a09f24ed0ad0a19d49e204be43df31b957576fba8fc3094295e28527" exitCode=0 Feb 17 18:06:25 crc kubenswrapper[4892]: I0217 18:06:25.446989 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" 
event={"ID":"f9e84558-a02a-4297-87f2-6b69a3b5f452","Type":"ContainerDied","Data":"f481d1c9a09f24ed0ad0a19d49e204be43df31b957576fba8fc3094295e28527"} Feb 17 18:06:25 crc kubenswrapper[4892]: I0217 18:06:25.447049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" event={"ID":"f9e84558-a02a-4297-87f2-6b69a3b5f452","Type":"ContainerStarted","Data":"ffdda80b687731c267061231019e007ba261f3e3d162c1927ed439b62939e3ec"} Feb 17 18:06:25 crc kubenswrapper[4892]: E0217 18:06:25.583187 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e84558_a02a_4297_87f2_6b69a3b5f452.slice/crio-conmon-f481d1c9a09f24ed0ad0a19d49e204be43df31b957576fba8fc3094295e28527.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e84558_a02a_4297_87f2_6b69a3b5f452.slice/crio-f481d1c9a09f24ed0ad0a19d49e204be43df31b957576fba8fc3094295e28527.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.173172 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.173764 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-central-agent" containerID="cri-o://b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f" gracePeriod=30 Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.173865 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-notification-agent" containerID="cri-o://6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f" gracePeriod=30 Feb 17 
18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.173862 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="sg-core" containerID="cri-o://b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444" gracePeriod=30 Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.173927 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="proxy-httpd" containerID="cri-o://de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262" gracePeriod=30 Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.458485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" event={"ID":"f9e84558-a02a-4297-87f2-6b69a3b5f452","Type":"ContainerStarted","Data":"047fafbe070b5c5f98549aa6ce2f47850884bd46a8bba02c12bc6d546ba53b35"} Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.458955 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.462079 4892 generic.go:334] "Generic (PLEG): container finished" podID="38e4898f-d94e-49aa-add7-918fe6417c43" containerID="de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262" exitCode=0 Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.462099 4892 generic.go:334] "Generic (PLEG): container finished" podID="38e4898f-d94e-49aa-add7-918fe6417c43" containerID="b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444" exitCode=2 Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.462115 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerDied","Data":"de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262"} Feb 17 18:06:26 crc 
kubenswrapper[4892]: I0217 18:06:26.462132 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerDied","Data":"b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444"} Feb 17 18:06:26 crc kubenswrapper[4892]: I0217 18:06:26.482810 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" podStartSLOduration=2.482788721 podStartE2EDuration="2.482788721s" podCreationTimestamp="2026-02-17 18:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:26.479572424 +0000 UTC m=+1357.854975719" watchObservedRunningTime="2026-02-17 18:06:26.482788721 +0000 UTC m=+1357.858191986" Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.486659 4892 generic.go:334] "Generic (PLEG): container finished" podID="38e4898f-d94e-49aa-add7-918fe6417c43" containerID="b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f" exitCode=0 Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.487561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerDied","Data":"b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f"} Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.487623 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.487846 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-log" containerID="cri-o://a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa" gracePeriod=30 Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.488102 4892 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-api-0" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-api" containerID="cri-o://9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad" gracePeriod=30 Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.867574 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.892608 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 18:06:27 crc kubenswrapper[4892]: I0217 18:06:27.892701 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 18:06:28 crc kubenswrapper[4892]: I0217 18:06:28.506952 4892 generic.go:334] "Generic (PLEG): container finished" podID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerID="a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa" exitCode=143 Feb 17 18:06:28 crc kubenswrapper[4892]: I0217 18:06:28.507044 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5f00cf6-62fb-43dc-83b0-d07d1013ac39","Type":"ContainerDied","Data":"a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa"} Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.170326 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.295640 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-ceilometer-tls-certs\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.295723 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-sg-core-conf-yaml\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.295849 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-config-data\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.295975 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-scripts\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.296043 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-log-httpd\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.296098 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbfj5\" (UniqueName: 
\"kubernetes.io/projected/38e4898f-d94e-49aa-add7-918fe6417c43-kube-api-access-lbfj5\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.296213 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-run-httpd\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.296265 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-combined-ca-bundle\") pod \"38e4898f-d94e-49aa-add7-918fe6417c43\" (UID: \"38e4898f-d94e-49aa-add7-918fe6417c43\") " Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.296965 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.298636 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.299136 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.299156 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38e4898f-d94e-49aa-add7-918fe6417c43-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.301502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e4898f-d94e-49aa-add7-918fe6417c43-kube-api-access-lbfj5" (OuterVolumeSpecName: "kube-api-access-lbfj5") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "kube-api-access-lbfj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.301748 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-scripts" (OuterVolumeSpecName: "scripts") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.332995 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.376300 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.397320 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.400856 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbfj5\" (UniqueName: \"kubernetes.io/projected/38e4898f-d94e-49aa-add7-918fe6417c43-kube-api-access-lbfj5\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.400894 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.400905 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.400913 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.400922 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.435047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-config-data" (OuterVolumeSpecName: "config-data") pod "38e4898f-d94e-49aa-add7-918fe6417c43" (UID: "38e4898f-d94e-49aa-add7-918fe6417c43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.503303 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e4898f-d94e-49aa-add7-918fe6417c43-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.519050 4892 generic.go:334] "Generic (PLEG): container finished" podID="38e4898f-d94e-49aa-add7-918fe6417c43" containerID="6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f" exitCode=0 Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.519423 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.520803 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerDied","Data":"6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f"} Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.520866 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38e4898f-d94e-49aa-add7-918fe6417c43","Type":"ContainerDied","Data":"d5703726698fa29ac42a0e9a3eab01c0836cf4728fd1c75a8bdc113f08a05730"} Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.520886 4892 scope.go:117] "RemoveContainer" containerID="de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.547195 4892 scope.go:117] "RemoveContainer" containerID="b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.552948 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.560982 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.572861 4892 scope.go:117] "RemoveContainer" containerID="6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.574503 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.574938 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-central-agent" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.574955 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" 
containerName="ceilometer-central-agent" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.574970 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-notification-agent" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.574977 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-notification-agent" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.574993 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="proxy-httpd" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.574999 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="proxy-httpd" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.575026 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="sg-core" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.575032 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="sg-core" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.575233 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-notification-agent" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.575253 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="sg-core" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.575267 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="proxy-httpd" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.575284 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38e4898f-d94e-49aa-add7-918fe6417c43" containerName="ceilometer-central-agent" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.577162 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.579944 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.583984 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.584489 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.597757 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605642 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605751 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-config-data\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvxj7\" (UniqueName: \"kubernetes.io/projected/9b84e006-8d73-4cca-9ec6-8299bd4c018d-kube-api-access-pvxj7\") pod \"ceilometer-0\" (UID: 
\"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605892 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-run-httpd\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-scripts\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605964 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-log-httpd\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.605995 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.606434 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.615886 4892 
scope.go:117] "RemoveContainer" containerID="b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.640034 4892 scope.go:117] "RemoveContainer" containerID="de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.641145 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262\": container with ID starting with de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262 not found: ID does not exist" containerID="de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.641178 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262"} err="failed to get container status \"de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262\": rpc error: code = NotFound desc = could not find container \"de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262\": container with ID starting with de65f10d9c05d9eac74c81195172ac99385560eb797d1f1427c70ebf37cf3262 not found: ID does not exist" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.641201 4892 scope.go:117] "RemoveContainer" containerID="b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.641465 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444\": container with ID starting with b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444 not found: ID does not exist" containerID="b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444" Feb 17 
18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.641485 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444"} err="failed to get container status \"b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444\": rpc error: code = NotFound desc = could not find container \"b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444\": container with ID starting with b0be6ba2fe5e58abc9204ba7aeb869fbeb1d7c4f3d1e29aa7a7093aaafa65444 not found: ID does not exist" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.641500 4892 scope.go:117] "RemoveContainer" containerID="6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.643246 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f\": container with ID starting with 6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f not found: ID does not exist" containerID="6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.643290 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f"} err="failed to get container status \"6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f\": rpc error: code = NotFound desc = could not find container \"6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f\": container with ID starting with 6708cafffa28d54af9efb50772fe43d3255ae0ac65624dc2cc368cecec7f468f not found: ID does not exist" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.643317 4892 scope.go:117] "RemoveContainer" 
containerID="b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f" Feb 17 18:06:29 crc kubenswrapper[4892]: E0217 18:06:29.643699 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f\": container with ID starting with b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f not found: ID does not exist" containerID="b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.643735 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f"} err="failed to get container status \"b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f\": rpc error: code = NotFound desc = could not find container \"b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f\": container with ID starting with b59f376eb8012cd3ff996497e9794e6268a51d921e08d83e12a30a5f26c57a3f not found: ID does not exist" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.707874 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvxj7\" (UniqueName: \"kubernetes.io/projected/9b84e006-8d73-4cca-9ec6-8299bd4c018d-kube-api-access-pvxj7\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.707933 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-run-httpd\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.708580 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-run-httpd\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.708654 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-scripts\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.709273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-log-httpd\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.709318 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.709349 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.709470 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc 
kubenswrapper[4892]: I0217 18:06:29.709555 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-config-data\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.709636 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-log-httpd\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.712340 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-scripts\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.713070 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.713615 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-config-data\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.713746 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.715666 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.725859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvxj7\" (UniqueName: \"kubernetes.io/projected/9b84e006-8d73-4cca-9ec6-8299bd4c018d-kube-api-access-pvxj7\") pod \"ceilometer-0\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " pod="openstack/ceilometer-0" Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.844587 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:29 crc kubenswrapper[4892]: I0217 18:06:29.845799 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:30 crc kubenswrapper[4892]: I0217 18:06:30.344177 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:30 crc kubenswrapper[4892]: W0217 18:06:30.346626 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b84e006_8d73_4cca_9ec6_8299bd4c018d.slice/crio-65708a61bedf0366fe37683dcaeb7a95930fabd905d42864e2fb42fb55a0efd0 WatchSource:0}: Error finding container 65708a61bedf0366fe37683dcaeb7a95930fabd905d42864e2fb42fb55a0efd0: Status 404 returned error can't find the container with id 65708a61bedf0366fe37683dcaeb7a95930fabd905d42864e2fb42fb55a0efd0 Feb 17 18:06:30 crc kubenswrapper[4892]: I0217 18:06:30.530756 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerStarted","Data":"65708a61bedf0366fe37683dcaeb7a95930fabd905d42864e2fb42fb55a0efd0"} Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.124411 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.136309 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-config-data\") pod \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.136423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ptpb\" (UniqueName: \"kubernetes.io/projected/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-kube-api-access-6ptpb\") pod \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.136465 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-logs\") pod \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.136671 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-combined-ca-bundle\") pod \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\" (UID: \"c5f00cf6-62fb-43dc-83b0-d07d1013ac39\") " Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.136986 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-logs" (OuterVolumeSpecName: "logs") pod "c5f00cf6-62fb-43dc-83b0-d07d1013ac39" (UID: "c5f00cf6-62fb-43dc-83b0-d07d1013ac39"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.137975 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.143988 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-kube-api-access-6ptpb" (OuterVolumeSpecName: "kube-api-access-6ptpb") pod "c5f00cf6-62fb-43dc-83b0-d07d1013ac39" (UID: "c5f00cf6-62fb-43dc-83b0-d07d1013ac39"). InnerVolumeSpecName "kube-api-access-6ptpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.196432 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-config-data" (OuterVolumeSpecName: "config-data") pod "c5f00cf6-62fb-43dc-83b0-d07d1013ac39" (UID: "c5f00cf6-62fb-43dc-83b0-d07d1013ac39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.205151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f00cf6-62fb-43dc-83b0-d07d1013ac39" (UID: "c5f00cf6-62fb-43dc-83b0-d07d1013ac39"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.241115 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ptpb\" (UniqueName: \"kubernetes.io/projected/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-kube-api-access-6ptpb\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.241354 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.241367 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f00cf6-62fb-43dc-83b0-d07d1013ac39-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.372508 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e4898f-d94e-49aa-add7-918fe6417c43" path="/var/lib/kubelet/pods/38e4898f-d94e-49aa-add7-918fe6417c43/volumes" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.548915 4892 generic.go:334] "Generic (PLEG): container finished" podID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerID="9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad" exitCode=0 Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.548970 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.549006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5f00cf6-62fb-43dc-83b0-d07d1013ac39","Type":"ContainerDied","Data":"9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad"} Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.549058 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5f00cf6-62fb-43dc-83b0-d07d1013ac39","Type":"ContainerDied","Data":"1d29297b6ed97cf2eb38e716e4cbd6509cb9ba206e21c95ad0bcf319a7f6a30e"} Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.549080 4892 scope.go:117] "RemoveContainer" containerID="9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.554280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerStarted","Data":"2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f"} Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.583623 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.612124 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.621443 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:31 crc kubenswrapper[4892]: E0217 18:06:31.621988 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-api" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.622006 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-api" Feb 17 18:06:31 crc kubenswrapper[4892]: E0217 
18:06:31.622017 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-log" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.622023 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-log" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.622216 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-api" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.622231 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" containerName="nova-api-log" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.623357 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.625614 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.626176 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.626363 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.633174 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.645992 4892 scope.go:117] "RemoveContainer" containerID="a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.650625 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-logs\") pod 
\"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.650684 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.650739 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.650768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdb6\" (UniqueName: \"kubernetes.io/projected/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-kube-api-access-xkdb6\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.650808 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-config-data\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.651454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-public-tls-certs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc 
kubenswrapper[4892]: I0217 18:06:31.691718 4892 scope.go:117] "RemoveContainer" containerID="9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad" Feb 17 18:06:31 crc kubenswrapper[4892]: E0217 18:06:31.693118 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad\": container with ID starting with 9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad not found: ID does not exist" containerID="9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.693151 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad"} err="failed to get container status \"9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad\": rpc error: code = NotFound desc = could not find container \"9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad\": container with ID starting with 9836f2bba378d07e986d22e558347cecc28683307c36bd0001343389fc8cc1ad not found: ID does not exist" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.693171 4892 scope.go:117] "RemoveContainer" containerID="a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa" Feb 17 18:06:31 crc kubenswrapper[4892]: E0217 18:06:31.693426 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa\": container with ID starting with a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa not found: ID does not exist" containerID="a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.693444 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa"} err="failed to get container status \"a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa\": rpc error: code = NotFound desc = could not find container \"a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa\": container with ID starting with a93d2b9992861dc290227f210f249e17f3b1196c0ba166f7ff424cb14caa20aa not found: ID does not exist" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.758268 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-config-data\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.758345 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-public-tls-certs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.758382 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-logs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.758437 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.758517 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.758551 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdb6\" (UniqueName: \"kubernetes.io/projected/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-kube-api-access-xkdb6\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.762997 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-logs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.766218 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-config-data\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.769423 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-public-tls-certs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.772780 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 
18:06:31.774553 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.778638 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdb6\" (UniqueName: \"kubernetes.io/projected/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-kube-api-access-xkdb6\") pod \"nova-api-0\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") " pod="openstack/nova-api-0" Feb 17 18:06:31 crc kubenswrapper[4892]: I0217 18:06:31.956578 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.425161 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:06:32 crc kubenswrapper[4892]: W0217 18:06:32.434185 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda03bb4eb_c2d1_46ec_b3f0_0cae95ae0382.slice/crio-2d468f370c8ba167da3ac1736a14cf6b395b399e2ad9b55a3ab37447643a9f16 WatchSource:0}: Error finding container 2d468f370c8ba167da3ac1736a14cf6b395b399e2ad9b55a3ab37447643a9f16: Status 404 returned error can't find the container with id 2d468f370c8ba167da3ac1736a14cf6b395b399e2ad9b55a3ab37447643a9f16 Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.569859 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerStarted","Data":"031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0"} Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.569913 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerStarted","Data":"e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8"} Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.571908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382","Type":"ContainerStarted","Data":"2d468f370c8ba167da3ac1736a14cf6b395b399e2ad9b55a3ab37447643a9f16"} Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.867611 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.893215 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.893253 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 18:06:32 crc kubenswrapper[4892]: I0217 18:06:32.902154 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.371131 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f00cf6-62fb-43dc-83b0-d07d1013ac39" path="/var/lib/kubelet/pods/c5f00cf6-62fb-43dc-83b0-d07d1013ac39/volumes" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.590011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382","Type":"ContainerStarted","Data":"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"} Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.590063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382","Type":"ContainerStarted","Data":"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"} Feb 17 
18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.614696 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6146706809999998 podStartE2EDuration="2.614670681s" podCreationTimestamp="2026-02-17 18:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:33.608921006 +0000 UTC m=+1364.984324291" watchObservedRunningTime="2026-02-17 18:06:33.614670681 +0000 UTC m=+1364.990073956" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.615156 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.829498 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-j5ngd"] Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.831656 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.834009 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.835261 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.840374 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j5ngd"] Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.909709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.909792 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-scripts\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.909892 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-config-data\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.909993 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpjmp\" (UniqueName: 
\"kubernetes.io/projected/7eb01add-fd0d-4606-9761-316f009e0002-kube-api-access-gpjmp\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.913943 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:33 crc kubenswrapper[4892]: I0217 18:06:33.913963 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.011589 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.011663 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-scripts\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.011718 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-config-data\") pod 
\"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.011783 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpjmp\" (UniqueName: \"kubernetes.io/projected/7eb01add-fd0d-4606-9761-316f009e0002-kube-api-access-gpjmp\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.015705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-scripts\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.015930 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-config-data\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.018230 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.029758 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpjmp\" (UniqueName: \"kubernetes.io/projected/7eb01add-fd0d-4606-9761-316f009e0002-kube-api-access-gpjmp\") pod \"nova-cell1-cell-mapping-j5ngd\" (UID: 
\"7eb01add-fd0d-4606-9761-316f009e0002\") " pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.161912 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j5ngd" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.407110 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.474125 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-lzftn"] Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.474381 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerName="dnsmasq-dns" containerID="cri-o://2e7ce2ed233decd4d799f8db2acda887290e7fb0198c1925a7111823bcc856c3" gracePeriod=10 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.615516 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerStarted","Data":"a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71"} Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.616762 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.616427 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="proxy-httpd" containerID="cri-o://a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71" gracePeriod=30 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.616442 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" 
containerName="sg-core" containerID="cri-o://031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0" gracePeriod=30 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.616450 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-notification-agent" containerID="cri-o://e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8" gracePeriod=30 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.616132 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-central-agent" containerID="cri-o://2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f" gracePeriod=30 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.631677 4892 generic.go:334] "Generic (PLEG): container finished" podID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerID="2e7ce2ed233decd4d799f8db2acda887290e7fb0198c1925a7111823bcc856c3" exitCode=0 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.632138 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" event={"ID":"8bbfb7ed-07bf-43e0-baf7-58baefccbba9","Type":"ContainerDied","Data":"2e7ce2ed233decd4d799f8db2acda887290e7fb0198c1925a7111823bcc856c3"} Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.641535 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.998179476 podStartE2EDuration="5.641519576s" podCreationTimestamp="2026-02-17 18:06:29 +0000 UTC" firstStartedPulling="2026-02-17 18:06:30.349358072 +0000 UTC m=+1361.724761337" lastFinishedPulling="2026-02-17 18:06:33.992698172 +0000 UTC m=+1365.368101437" observedRunningTime="2026-02-17 18:06:34.639254595 +0000 UTC m=+1366.014657860" watchObservedRunningTime="2026-02-17 18:06:34.641519576 +0000 
UTC m=+1366.016922841" Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.695249 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j5ngd"] Feb 17 18:06:34 crc kubenswrapper[4892]: W0217 18:06:34.714080 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb01add_fd0d_4606_9761_316f009e0002.slice/crio-431c86eacb42718ab2f1c140586b2012f17fad5568c707c1dac07dbc8ab43a84 WatchSource:0}: Error finding container 431c86eacb42718ab2f1c140586b2012f17fad5568c707c1dac07dbc8ab43a84: Status 404 returned error can't find the container with id 431c86eacb42718ab2f1c140586b2012f17fad5568c707c1dac07dbc8ab43a84 Feb 17 18:06:34 crc kubenswrapper[4892]: I0217 18:06:34.983198 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.143099 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-swift-storage-0\") pod \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.143196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-config\") pod \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.143221 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-nb\") pod \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " Feb 17 18:06:35 crc 
kubenswrapper[4892]: I0217 18:06:35.143267 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-sb\") pod \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.143324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rd5p\" (UniqueName: \"kubernetes.io/projected/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-kube-api-access-8rd5p\") pod \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.143355 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-svc\") pod \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\" (UID: \"8bbfb7ed-07bf-43e0-baf7-58baefccbba9\") " Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.149044 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-kube-api-access-8rd5p" (OuterVolumeSpecName: "kube-api-access-8rd5p") pod "8bbfb7ed-07bf-43e0-baf7-58baefccbba9" (UID: "8bbfb7ed-07bf-43e0-baf7-58baefccbba9"). InnerVolumeSpecName "kube-api-access-8rd5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.223808 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bbfb7ed-07bf-43e0-baf7-58baefccbba9" (UID: "8bbfb7ed-07bf-43e0-baf7-58baefccbba9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.224880 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-config" (OuterVolumeSpecName: "config") pod "8bbfb7ed-07bf-43e0-baf7-58baefccbba9" (UID: "8bbfb7ed-07bf-43e0-baf7-58baefccbba9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.234046 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bbfb7ed-07bf-43e0-baf7-58baefccbba9" (UID: "8bbfb7ed-07bf-43e0-baf7-58baefccbba9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.241343 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bbfb7ed-07bf-43e0-baf7-58baefccbba9" (UID: "8bbfb7ed-07bf-43e0-baf7-58baefccbba9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.248013 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.248057 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rd5p\" (UniqueName: \"kubernetes.io/projected/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-kube-api-access-8rd5p\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.248071 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.248081 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.248090 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.249751 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bbfb7ed-07bf-43e0-baf7-58baefccbba9" (UID: "8bbfb7ed-07bf-43e0-baf7-58baefccbba9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.350644 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bbfb7ed-07bf-43e0-baf7-58baefccbba9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.645208 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j5ngd" event={"ID":"7eb01add-fd0d-4606-9761-316f009e0002","Type":"ContainerStarted","Data":"a13a8f8cc49ae7cb4143bc67cb64f91acf0fe318d17893c57c649069e80a7423"} Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.645261 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j5ngd" event={"ID":"7eb01add-fd0d-4606-9761-316f009e0002","Type":"ContainerStarted","Data":"431c86eacb42718ab2f1c140586b2012f17fad5568c707c1dac07dbc8ab43a84"} Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.653863 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerID="a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71" exitCode=0 Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.653899 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerID="031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0" exitCode=2 Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.653912 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerID="e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8" exitCode=0 Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.653976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerDied","Data":"a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71"} Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.654002 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerDied","Data":"031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0"} Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.654016 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerDied","Data":"e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8"} Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.658189 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" event={"ID":"8bbfb7ed-07bf-43e0-baf7-58baefccbba9","Type":"ContainerDied","Data":"a74b4c02d0ae62c584c20ca1424a783205e7af96e1b591ea1ec72ff2736c763f"} Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.658231 4892 scope.go:117] "RemoveContainer" containerID="2e7ce2ed233decd4d799f8db2acda887290e7fb0198c1925a7111823bcc856c3" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.658367 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-lzftn" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.685088 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-j5ngd" podStartSLOduration=2.685011832 podStartE2EDuration="2.685011832s" podCreationTimestamp="2026-02-17 18:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:35.670056837 +0000 UTC m=+1367.045460122" watchObservedRunningTime="2026-02-17 18:06:35.685011832 +0000 UTC m=+1367.060415137" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.703709 4892 scope.go:117] "RemoveContainer" containerID="cfa0711cc5a62159977c3f380e6ebdb61760d793a1f27715a87929a7d74e57da" Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.711888 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-lzftn"] Feb 17 18:06:35 crc kubenswrapper[4892]: I0217 18:06:35.726699 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-lzftn"] Feb 17 18:06:37 crc kubenswrapper[4892]: I0217 18:06:37.384937 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" path="/var/lib/kubelet/pods/8bbfb7ed-07bf-43e0-baf7-58baefccbba9/volumes" Feb 17 18:06:37 crc kubenswrapper[4892]: I0217 18:06:37.424851 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:06:37 crc kubenswrapper[4892]: I0217 18:06:37.425119 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.146789 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.336839 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-sg-core-conf-yaml\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.336890 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvxj7\" (UniqueName: \"kubernetes.io/projected/9b84e006-8d73-4cca-9ec6-8299bd4c018d-kube-api-access-pvxj7\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.336952 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-ceilometer-tls-certs\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.337048 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-run-httpd\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.337115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-scripts\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.337189 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-config-data\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.337224 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-combined-ca-bundle\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.337259 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-log-httpd\") pod \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\" (UID: \"9b84e006-8d73-4cca-9ec6-8299bd4c018d\") " Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.338552 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.338677 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.342497 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b84e006-8d73-4cca-9ec6-8299bd4c018d-kube-api-access-pvxj7" (OuterVolumeSpecName: "kube-api-access-pvxj7") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). InnerVolumeSpecName "kube-api-access-pvxj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.364081 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-scripts" (OuterVolumeSpecName: "scripts") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.368217 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.419782 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.442949 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.442990 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvxj7\" (UniqueName: \"kubernetes.io/projected/9b84e006-8d73-4cca-9ec6-8299bd4c018d-kube-api-access-pvxj7\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.443007 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.443022 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.443034 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.443045 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b84e006-8d73-4cca-9ec6-8299bd4c018d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.448331 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-config-data" (OuterVolumeSpecName: "config-data") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.448705 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b84e006-8d73-4cca-9ec6-8299bd4c018d" (UID: "9b84e006-8d73-4cca-9ec6-8299bd4c018d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.544968 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.544996 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84e006-8d73-4cca-9ec6-8299bd4c018d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.696274 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerID="2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f" exitCode=0 Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.696326 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerDied","Data":"2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f"} Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.696358 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b84e006-8d73-4cca-9ec6-8299bd4c018d","Type":"ContainerDied","Data":"65708a61bedf0366fe37683dcaeb7a95930fabd905d42864e2fb42fb55a0efd0"} Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.696381 4892 scope.go:117] 
"RemoveContainer" containerID="a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.696578 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.723257 4892 scope.go:117] "RemoveContainer" containerID="031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.747698 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.758874 4892 scope.go:117] "RemoveContainer" containerID="e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.766962 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793125 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.793600 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-notification-agent" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793617 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-notification-agent" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.793634 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="proxy-httpd" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793640 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="proxy-httpd" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.793653 4892 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="sg-core" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793659 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="sg-core" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.793690 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerName="dnsmasq-dns" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793697 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerName="dnsmasq-dns" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.793706 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-central-agent" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793711 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-central-agent" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.793755 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerName="init" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793762 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerName="init" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.793988 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-notification-agent" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.794002 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="proxy-httpd" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.794093 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="ceilometer-central-agent" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.794108 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" containerName="sg-core" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.794128 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bbfb7ed-07bf-43e0-baf7-58baefccbba9" containerName="dnsmasq-dns" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.797377 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.805090 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.805117 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.805389 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.806313 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.814195 4892 scope.go:117] "RemoveContainer" containerID="2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.867118 4892 scope.go:117] "RemoveContainer" containerID="a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.867576 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71\": container with ID starting with 
a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71 not found: ID does not exist" containerID="a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.867616 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71"} err="failed to get container status \"a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71\": rpc error: code = NotFound desc = could not find container \"a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71\": container with ID starting with a5cd172333fe23357d21e0bc3aa908567d417105d32df53ee4cfafdac0ac5a71 not found: ID does not exist" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.867642 4892 scope.go:117] "RemoveContainer" containerID="031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.867921 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0\": container with ID starting with 031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0 not found: ID does not exist" containerID="031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.867952 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0"} err="failed to get container status \"031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0\": rpc error: code = NotFound desc = could not find container \"031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0\": container with ID starting with 031f61f13cff7f5d8f863dc06d8841658141e6387baea807bdc8db7db2c2d7a0 not found: ID does not 
exist" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.867970 4892 scope.go:117] "RemoveContainer" containerID="e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.868267 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8\": container with ID starting with e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8 not found: ID does not exist" containerID="e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.868308 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8"} err="failed to get container status \"e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8\": rpc error: code = NotFound desc = could not find container \"e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8\": container with ID starting with e6b0cc1173074472235b1b30b11ebde69de7faf2706d2dee317f06349aa7adc8 not found: ID does not exist" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.868334 4892 scope.go:117] "RemoveContainer" containerID="2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f" Feb 17 18:06:38 crc kubenswrapper[4892]: E0217 18:06:38.868632 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f\": container with ID starting with 2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f not found: ID does not exist" containerID="2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.868664 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f"} err="failed to get container status \"2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f\": rpc error: code = NotFound desc = could not find container \"2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f\": container with ID starting with 2d7d5afebf09360bdddfa65fa236b8895cf2f07052b364cedf26d8934ecae61f not found: ID does not exist" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.951897 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-log-httpd\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.951948 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-run-httpd\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.952014 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.952035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-scripts\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc 
kubenswrapper[4892]: I0217 18:06:38.952080 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.952118 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtsm\" (UniqueName: \"kubernetes.io/projected/2e7cdd99-a572-4a20-834b-c1373e080496-kube-api-access-mwtsm\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.952512 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:38 crc kubenswrapper[4892]: I0217 18:06:38.952605 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-config-data\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055138 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-log-httpd\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055192 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-run-httpd\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055268 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-scripts\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055289 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055328 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtsm\" (UniqueName: \"kubernetes.io/projected/2e7cdd99-a572-4a20-834b-c1373e080496-kube-api-access-mwtsm\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055427 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " 
pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055460 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-config-data\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.055642 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-log-httpd\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.057292 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-run-httpd\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.059112 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.059640 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.070070 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-scripts\") pod 
\"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.070509 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.070895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-config-data\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.071175 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtsm\" (UniqueName: \"kubernetes.io/projected/2e7cdd99-a572-4a20-834b-c1373e080496-kube-api-access-mwtsm\") pod \"ceilometer-0\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " pod="openstack/ceilometer-0" Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.164786 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.380850 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b84e006-8d73-4cca-9ec6-8299bd4c018d" path="/var/lib/kubelet/pods/9b84e006-8d73-4cca-9ec6-8299bd4c018d/volumes"
Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.643250 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.708084 4892 generic.go:334] "Generic (PLEG): container finished" podID="7eb01add-fd0d-4606-9761-316f009e0002" containerID="a13a8f8cc49ae7cb4143bc67cb64f91acf0fe318d17893c57c649069e80a7423" exitCode=0
Feb 17 18:06:39 crc kubenswrapper[4892]: I0217 18:06:39.708122 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j5ngd" event={"ID":"7eb01add-fd0d-4606-9761-316f009e0002","Type":"ContainerDied","Data":"a13a8f8cc49ae7cb4143bc67cb64f91acf0fe318d17893c57c649069e80a7423"}
Feb 17 18:06:39 crc kubenswrapper[4892]: W0217 18:06:39.710675 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e7cdd99_a572_4a20_834b_c1373e080496.slice/crio-d249dc9a688f9ce92c0ecb0ce0b49077738c7d329b6a674732f4c71855355a6e WatchSource:0}: Error finding container d249dc9a688f9ce92c0ecb0ce0b49077738c7d329b6a674732f4c71855355a6e: Status 404 returned error can't find the container with id d249dc9a688f9ce92c0ecb0ce0b49077738c7d329b6a674732f4c71855355a6e
Feb 17 18:06:40 crc kubenswrapper[4892]: I0217 18:06:40.717935 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerStarted","Data":"366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4"}
Feb 17 18:06:40 crc kubenswrapper[4892]: I0217 18:06:40.718383 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerStarted","Data":"d249dc9a688f9ce92c0ecb0ce0b49077738c7d329b6a674732f4c71855355a6e"}
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.176194 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j5ngd"
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.304763 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-scripts\") pod \"7eb01add-fd0d-4606-9761-316f009e0002\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") "
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.304910 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-config-data\") pod \"7eb01add-fd0d-4606-9761-316f009e0002\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") "
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.304964 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpjmp\" (UniqueName: \"kubernetes.io/projected/7eb01add-fd0d-4606-9761-316f009e0002-kube-api-access-gpjmp\") pod \"7eb01add-fd0d-4606-9761-316f009e0002\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") "
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.305053 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-combined-ca-bundle\") pod \"7eb01add-fd0d-4606-9761-316f009e0002\" (UID: \"7eb01add-fd0d-4606-9761-316f009e0002\") "
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.316527 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-scripts" (OuterVolumeSpecName: "scripts") pod "7eb01add-fd0d-4606-9761-316f009e0002" (UID: "7eb01add-fd0d-4606-9761-316f009e0002"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.316748 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb01add-fd0d-4606-9761-316f009e0002-kube-api-access-gpjmp" (OuterVolumeSpecName: "kube-api-access-gpjmp") pod "7eb01add-fd0d-4606-9761-316f009e0002" (UID: "7eb01add-fd0d-4606-9761-316f009e0002"). InnerVolumeSpecName "kube-api-access-gpjmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.332761 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eb01add-fd0d-4606-9761-316f009e0002" (UID: "7eb01add-fd0d-4606-9761-316f009e0002"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.360219 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-config-data" (OuterVolumeSpecName: "config-data") pod "7eb01add-fd0d-4606-9761-316f009e0002" (UID: "7eb01add-fd0d-4606-9761-316f009e0002"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.408677 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.408710 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpjmp\" (UniqueName: \"kubernetes.io/projected/7eb01add-fd0d-4606-9761-316f009e0002-kube-api-access-gpjmp\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.408721 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.408729 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb01add-fd0d-4606-9761-316f009e0002-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.729480 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerStarted","Data":"a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62"}
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.733866 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j5ngd" event={"ID":"7eb01add-fd0d-4606-9761-316f009e0002","Type":"ContainerDied","Data":"431c86eacb42718ab2f1c140586b2012f17fad5568c707c1dac07dbc8ab43a84"}
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.733897 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431c86eacb42718ab2f1c140586b2012f17fad5568c707c1dac07dbc8ab43a84"
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.733955 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j5ngd"
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.922944 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.923286 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-log" containerID="cri-o://cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf" gracePeriod=30
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.923880 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-api" containerID="cri-o://61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c" gracePeriod=30
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.943031 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.943325 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" containerName="nova-scheduler-scheduler" containerID="cri-o://f7336952b591bb5a3beb50a3e49535fdba933d66072b6bbdd4a002161abb2ff1" gracePeriod=30
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.993053 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.993339 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-log" containerID="cri-o://00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7" gracePeriod=30
Feb 17 18:06:41 crc kubenswrapper[4892]: I0217 18:06:41.993427 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-metadata" containerID="cri-o://6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803" gracePeriod=30
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.488836 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.634295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-combined-ca-bundle\") pod \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") "
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.634508 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-internal-tls-certs\") pod \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") "
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.634543 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-public-tls-certs\") pod \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") "
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.634569 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-logs\") pod \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") "
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.634593 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-config-data\") pod \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") "
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.634617 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkdb6\" (UniqueName: \"kubernetes.io/projected/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-kube-api-access-xkdb6\") pod \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\" (UID: \"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382\") "
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.635156 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-logs" (OuterVolumeSpecName: "logs") pod "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" (UID: "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.645083 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-kube-api-access-xkdb6" (OuterVolumeSpecName: "kube-api-access-xkdb6") pod "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" (UID: "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382"). InnerVolumeSpecName "kube-api-access-xkdb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.667031 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-config-data" (OuterVolumeSpecName: "config-data") pod "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" (UID: "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.680339 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" (UID: "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.693858 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" (UID: "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.712503 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" (UID: "a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.740018 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.740054 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.740088 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.740102 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-logs\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.740115 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.740128 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkdb6\" (UniqueName: \"kubernetes.io/projected/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382-kube-api-access-xkdb6\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.756697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerStarted","Data":"a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e"}
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.759978 4892 generic.go:334] "Generic (PLEG): container finished" podID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerID="61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c" exitCode=0
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.760009 4892 generic.go:334] "Generic (PLEG): container finished" podID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerID="cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf" exitCode=143
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.760061 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382","Type":"ContainerDied","Data":"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"}
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.760084 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382","Type":"ContainerDied","Data":"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"}
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.760098 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382","Type":"ContainerDied","Data":"2d468f370c8ba167da3ac1736a14cf6b395b399e2ad9b55a3ab37447643a9f16"}
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.760116 4892 scope.go:117] "RemoveContainer" containerID="61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.760317 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.765246 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerID="00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7" exitCode=143
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.765287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b9c463d-2912-48ed-b342-74e1a2f92d19","Type":"ContainerDied","Data":"00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7"}
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.788741 4892 scope.go:117] "RemoveContainer" containerID="cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.806010 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.824980 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.833117 4892 scope.go:117] "RemoveContainer" containerID="61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"
Feb 17 18:06:42 crc kubenswrapper[4892]: E0217 18:06:42.833550 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c\": container with ID starting with 61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c not found: ID does not exist" containerID="61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.833581 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"} err="failed to get container status \"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c\": rpc error: code = NotFound desc = could not find container \"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c\": container with ID starting with 61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c not found: ID does not exist"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.833601 4892 scope.go:117] "RemoveContainer" containerID="cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"
Feb 17 18:06:42 crc kubenswrapper[4892]: E0217 18:06:42.833829 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf\": container with ID starting with cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf not found: ID does not exist" containerID="cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.833852 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"} err="failed to get container status \"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf\": rpc error: code = NotFound desc = could not find container \"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf\": container with ID starting with cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf not found: ID does not exist"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.833866 4892 scope.go:117] "RemoveContainer" containerID="61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.834048 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c"} err="failed to get container status \"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c\": rpc error: code = NotFound desc = could not find container \"61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c\": container with ID starting with 61f78b078f502e6cd5e7b57413f884e6841707d6503e3a81bb42b5c83073e76c not found: ID does not exist"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.834069 4892 scope.go:117] "RemoveContainer" containerID="cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.835906 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf"} err="failed to get container status \"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf\": rpc error: code = NotFound desc = could not find container \"cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf\": container with ID starting with cd47d7cbceebadd0b68ffad90cfd37403f8e798f16f16647926b6633524d52cf not found: ID does not exist"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.856967 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:06:42 crc kubenswrapper[4892]: E0217 18:06:42.857937 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-log"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.865389 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-log"
Feb 17 18:06:42 crc kubenswrapper[4892]: E0217 18:06:42.865496 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb01add-fd0d-4606-9761-316f009e0002" containerName="nova-manage"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.865505 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb01add-fd0d-4606-9761-316f009e0002" containerName="nova-manage"
Feb 17 18:06:42 crc kubenswrapper[4892]: E0217 18:06:42.865529 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-api"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.865536 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-api"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.865930 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-log"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.865965 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb01add-fd0d-4606-9761-316f009e0002" containerName="nova-manage"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.865978 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" containerName="nova-api-api"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.866972 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.867055 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.870470 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.870678 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 17 18:06:42 crc kubenswrapper[4892]: I0217 18:06:42.870918 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.050509 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68cd920a-ec23-4053-a5b6-02adbf11eaf0-logs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.050768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-public-tls-certs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.051225 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc7f\" (UniqueName: \"kubernetes.io/projected/68cd920a-ec23-4053-a5b6-02adbf11eaf0-kube-api-access-wzc7f\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.051524 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.051737 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-config-data\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.051991 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.153629 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.153868 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-config-data\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.153908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.153964 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68cd920a-ec23-4053-a5b6-02adbf11eaf0-logs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.153980 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-public-tls-certs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.154018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc7f\" (UniqueName: \"kubernetes.io/projected/68cd920a-ec23-4053-a5b6-02adbf11eaf0-kube-api-access-wzc7f\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.155315 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68cd920a-ec23-4053-a5b6-02adbf11eaf0-logs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.157835 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.158293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.158915 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-config-data\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.166382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-public-tls-certs\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.171076 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc7f\" (UniqueName: \"kubernetes.io/projected/68cd920a-ec23-4053-a5b6-02adbf11eaf0-kube-api-access-wzc7f\") pod \"nova-api-0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.362885 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.385136 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382" path="/var/lib/kubelet/pods/a03bb4eb-c2d1-46ec-b3f0-0cae95ae0382/volumes"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.779841 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerStarted","Data":"7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66"}
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.780185 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.788394 4892 generic.go:334] "Generic (PLEG): container finished" podID="b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" containerID="f7336952b591bb5a3beb50a3e49535fdba933d66072b6bbdd4a002161abb2ff1" exitCode=0
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.788442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e","Type":"ContainerDied","Data":"f7336952b591bb5a3beb50a3e49535fdba933d66072b6bbdd4a002161abb2ff1"}
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.816155 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.65976107 podStartE2EDuration="5.816134126s" podCreationTimestamp="2026-02-17 18:06:38 +0000 UTC" firstStartedPulling="2026-02-17 18:06:39.713942604 +0000 UTC m=+1371.089345869" lastFinishedPulling="2026-02-17 18:06:42.87031566 +0000 UTC m=+1374.245718925" observedRunningTime="2026-02-17 18:06:43.809491427 +0000 UTC m=+1375.184894692" watchObservedRunningTime="2026-02-17 18:06:43.816134126 +0000 UTC m=+1375.191537401"
Feb 17 18:06:43 crc kubenswrapper[4892]: I0217 18:06:43.946345 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.097322 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.180359 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-combined-ca-bundle\") pod \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") "
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.180470 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-config-data\") pod \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") "
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.180597 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tfw5\" (UniqueName: \"kubernetes.io/projected/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-kube-api-access-6tfw5\") pod \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\" (UID: \"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e\") "
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.186986 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-kube-api-access-6tfw5" (OuterVolumeSpecName: "kube-api-access-6tfw5") pod "b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" (UID: "b3ba050d-ff82-4828-91b0-c2ca3eeccd5e"). InnerVolumeSpecName "kube-api-access-6tfw5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.213518 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" (UID: "b3ba050d-ff82-4828-91b0-c2ca3eeccd5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.217065 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-config-data" (OuterVolumeSpecName: "config-data") pod "b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" (UID: "b3ba050d-ff82-4828-91b0-c2ca3eeccd5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.282484 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.282515 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.282525 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tfw5\" (UniqueName: \"kubernetes.io/projected/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e-kube-api-access-6tfw5\") on node \"crc\" DevicePath \"\""
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.806637 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68cd920a-ec23-4053-a5b6-02adbf11eaf0","Type":"ContainerStarted","Data":"7576bc6a46f917916ace668163f910721525c50fdb2445e86a823be4d67ae777"}
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.807003 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68cd920a-ec23-4053-a5b6-02adbf11eaf0","Type":"ContainerStarted","Data":"9bac58ad39e6590d4042399c64a90295868b511562a801762900f675847c27a2"}
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.807017 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68cd920a-ec23-4053-a5b6-02adbf11eaf0","Type":"ContainerStarted","Data":"5ab7188237312a8d3ad13b8d1c210fe67bf766c6e1983f2b5423b858542facb2"}
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.809648 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.809695 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3ba050d-ff82-4828-91b0-c2ca3eeccd5e","Type":"ContainerDied","Data":"def61e4a980cfcd3994302edbbbfa4374f051a5b2679f3a8e22f2eb506ca33e7"}
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.809725 4892 scope.go:117] "RemoveContainer" containerID="f7336952b591bb5a3beb50a3e49535fdba933d66072b6bbdd4a002161abb2ff1"
Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.843743 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.843724892 podStartE2EDuration="2.843724892s" podCreationTimestamp="2026-02-17 18:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:44.835323065 +0000 UTC m=+1376.210726340" watchObservedRunningTime="2026-02-17 18:06:44.843724892 +0000 UTC m=+1376.219128157"
Feb 17 18:06:44 crc kubenswrapper[4892]:
I0217 18:06:44.874893 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.897877 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.913232 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:44 crc kubenswrapper[4892]: E0217 18:06:44.913711 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" containerName="nova-scheduler-scheduler" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.913727 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" containerName="nova-scheduler-scheduler" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.913936 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" containerName="nova-scheduler-scheduler" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.914597 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.919502 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.945248 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.998051 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-config-data\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.998161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:44 crc kubenswrapper[4892]: I0217 18:06:44.998595 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8vkv\" (UniqueName: \"kubernetes.io/projected/253dfc82-aa27-4e6b-88a5-0af7a1d01370-kube-api-access-s8vkv\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.101004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8vkv\" (UniqueName: \"kubernetes.io/projected/253dfc82-aa27-4e6b-88a5-0af7a1d01370-kube-api-access-s8vkv\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.101142 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-config-data\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.101181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.123464 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.123664 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8vkv\" (UniqueName: \"kubernetes.io/projected/253dfc82-aa27-4e6b-88a5-0af7a1d01370-kube-api-access-s8vkv\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.124958 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-config-data\") pod \"nova-scheduler-0\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.260484 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.381693 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ba050d-ff82-4828-91b0-c2ca3eeccd5e" path="/var/lib/kubelet/pods/b3ba050d-ff82-4828-91b0-c2ca3eeccd5e/volumes" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.535737 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.614736 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-combined-ca-bundle\") pod \"7b9c463d-2912-48ed-b342-74e1a2f92d19\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.614960 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9c463d-2912-48ed-b342-74e1a2f92d19-logs\") pod \"7b9c463d-2912-48ed-b342-74e1a2f92d19\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.614999 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-nova-metadata-tls-certs\") pod \"7b9c463d-2912-48ed-b342-74e1a2f92d19\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.615039 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-config-data\") pod \"7b9c463d-2912-48ed-b342-74e1a2f92d19\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.615090 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x2qvw\" (UniqueName: \"kubernetes.io/projected/7b9c463d-2912-48ed-b342-74e1a2f92d19-kube-api-access-x2qvw\") pod \"7b9c463d-2912-48ed-b342-74e1a2f92d19\" (UID: \"7b9c463d-2912-48ed-b342-74e1a2f92d19\") " Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.615689 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9c463d-2912-48ed-b342-74e1a2f92d19-logs" (OuterVolumeSpecName: "logs") pod "7b9c463d-2912-48ed-b342-74e1a2f92d19" (UID: "7b9c463d-2912-48ed-b342-74e1a2f92d19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.620142 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9c463d-2912-48ed-b342-74e1a2f92d19-kube-api-access-x2qvw" (OuterVolumeSpecName: "kube-api-access-x2qvw") pod "7b9c463d-2912-48ed-b342-74e1a2f92d19" (UID: "7b9c463d-2912-48ed-b342-74e1a2f92d19"). InnerVolumeSpecName "kube-api-access-x2qvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.645946 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b9c463d-2912-48ed-b342-74e1a2f92d19" (UID: "7b9c463d-2912-48ed-b342-74e1a2f92d19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.647947 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-config-data" (OuterVolumeSpecName: "config-data") pod "7b9c463d-2912-48ed-b342-74e1a2f92d19" (UID: "7b9c463d-2912-48ed-b342-74e1a2f92d19"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.676122 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7b9c463d-2912-48ed-b342-74e1a2f92d19" (UID: "7b9c463d-2912-48ed-b342-74e1a2f92d19"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.716524 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.716555 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2qvw\" (UniqueName: \"kubernetes.io/projected/7b9c463d-2912-48ed-b342-74e1a2f92d19-kube-api-access-x2qvw\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.716564 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.716575 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9c463d-2912-48ed-b342-74e1a2f92d19-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.716583 4892 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9c463d-2912-48ed-b342-74e1a2f92d19-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:06:45 crc kubenswrapper[4892]: W0217 18:06:45.750850 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253dfc82_aa27_4e6b_88a5_0af7a1d01370.slice/crio-9907d4f455141a9cfdc6da39ffbe900759e96dff35d0ee2cdf18151bc4ab0378 WatchSource:0}: Error finding container 9907d4f455141a9cfdc6da39ffbe900759e96dff35d0ee2cdf18151bc4ab0378: Status 404 returned error can't find the container with id 9907d4f455141a9cfdc6da39ffbe900759e96dff35d0ee2cdf18151bc4ab0378 Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.754314 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.824166 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerID="6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803" exitCode=0 Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.824246 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.824250 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b9c463d-2912-48ed-b342-74e1a2f92d19","Type":"ContainerDied","Data":"6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803"} Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.824333 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b9c463d-2912-48ed-b342-74e1a2f92d19","Type":"ContainerDied","Data":"79484b580ee2d7bbbcb661dab756e77779250582197a3e30c09c2a2868dd3caf"} Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.824366 4892 scope.go:117] "RemoveContainer" containerID="6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.826726 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"253dfc82-aa27-4e6b-88a5-0af7a1d01370","Type":"ContainerStarted","Data":"9907d4f455141a9cfdc6da39ffbe900759e96dff35d0ee2cdf18151bc4ab0378"} Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.856135 4892 scope.go:117] "RemoveContainer" containerID="00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.868725 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.890596 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.903415 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:45 crc kubenswrapper[4892]: E0217 18:06:45.903989 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-log" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.904011 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-log" Feb 17 18:06:45 crc kubenswrapper[4892]: E0217 18:06:45.904068 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-metadata" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.904074 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-metadata" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.904286 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" containerName="nova-metadata-metadata" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.904311 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" 
containerName="nova-metadata-log" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.905743 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.908068 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.915011 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.916926 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.935649 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-config-data\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.935794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.935850 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.935897 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02530d-521f-427b-a570-f35de0665ecc-logs\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.936024 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4rj\" (UniqueName: \"kubernetes.io/projected/eb02530d-521f-427b-a570-f35de0665ecc-kube-api-access-rk4rj\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.988591 4892 scope.go:117] "RemoveContainer" containerID="6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803" Feb 17 18:06:45 crc kubenswrapper[4892]: E0217 18:06:45.993995 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803\": container with ID starting with 6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803 not found: ID does not exist" containerID="6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.994050 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803"} err="failed to get container status \"6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803\": rpc error: code = NotFound desc = could not find container \"6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803\": container with ID starting with 6a2878294542b4973fbce5486513e4efda079ff8205b1e1994b292781335a803 not found: ID does not exist" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.994085 4892 scope.go:117] 
"RemoveContainer" containerID="00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7" Feb 17 18:06:45 crc kubenswrapper[4892]: E0217 18:06:45.994438 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7\": container with ID starting with 00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7 not found: ID does not exist" containerID="00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7" Feb 17 18:06:45 crc kubenswrapper[4892]: I0217 18:06:45.994483 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7"} err="failed to get container status \"00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7\": rpc error: code = NotFound desc = could not find container \"00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7\": container with ID starting with 00579e8a9d0e4012a3eb6b9376f292f349e2389940e2dd79e00a3a9d80c4c6f7 not found: ID does not exist" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.037095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4rj\" (UniqueName: \"kubernetes.io/projected/eb02530d-521f-427b-a570-f35de0665ecc-kube-api-access-rk4rj\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.037194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-config-data\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.037268 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.037294 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.037314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02530d-521f-427b-a570-f35de0665ecc-logs\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.037750 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02530d-521f-427b-a570-f35de0665ecc-logs\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.042870 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.043515 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-config-data\") pod \"nova-metadata-0\" (UID: 
\"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.043634 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.053903 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4rj\" (UniqueName: \"kubernetes.io/projected/eb02530d-521f-427b-a570-f35de0665ecc-kube-api-access-rk4rj\") pod \"nova-metadata-0\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.294646 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.796787 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.844597 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"253dfc82-aa27-4e6b-88a5-0af7a1d01370","Type":"ContainerStarted","Data":"4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc"} Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.848934 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb02530d-521f-427b-a570-f35de0665ecc","Type":"ContainerStarted","Data":"01477c9b0fffb8762ca0569e23eebe055e41984b3f6d8e473e0115918c26f1da"} Feb 17 18:06:46 crc kubenswrapper[4892]: I0217 18:06:46.889699 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.889674103 podStartE2EDuration="2.889674103s" 
podCreationTimestamp="2026-02-17 18:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:46.870075164 +0000 UTC m=+1378.245478439" watchObservedRunningTime="2026-02-17 18:06:46.889674103 +0000 UTC m=+1378.265077378" Feb 17 18:06:47 crc kubenswrapper[4892]: I0217 18:06:47.386746 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9c463d-2912-48ed-b342-74e1a2f92d19" path="/var/lib/kubelet/pods/7b9c463d-2912-48ed-b342-74e1a2f92d19/volumes" Feb 17 18:06:47 crc kubenswrapper[4892]: I0217 18:06:47.865036 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb02530d-521f-427b-a570-f35de0665ecc","Type":"ContainerStarted","Data":"f69a74c1d7ff5f5e7cab3dfb70cfb2a1f2475fdc6a082b95bd0059bf7596b0bc"} Feb 17 18:06:47 crc kubenswrapper[4892]: I0217 18:06:47.865093 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb02530d-521f-427b-a570-f35de0665ecc","Type":"ContainerStarted","Data":"f7ec71d9b448df0c427bf1c569f629e2d1b04b498fb150e292de366d671c7e43"} Feb 17 18:06:47 crc kubenswrapper[4892]: I0217 18:06:47.905520 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.905500451 podStartE2EDuration="2.905500451s" podCreationTimestamp="2026-02-17 18:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:06:47.88139722 +0000 UTC m=+1379.256800535" watchObservedRunningTime="2026-02-17 18:06:47.905500451 +0000 UTC m=+1379.280903726" Feb 17 18:06:50 crc kubenswrapper[4892]: I0217 18:06:50.260793 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 18:06:51 crc kubenswrapper[4892]: I0217 18:06:51.295452 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 18:06:51 crc kubenswrapper[4892]: I0217 18:06:51.295843 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 18:06:53 crc kubenswrapper[4892]: I0217 18:06:53.377716 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 18:06:53 crc kubenswrapper[4892]: I0217 18:06:53.378081 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 18:06:54 crc kubenswrapper[4892]: I0217 18:06:54.380986 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:54 crc kubenswrapper[4892]: I0217 18:06:54.381318 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:55 crc kubenswrapper[4892]: I0217 18:06:55.261661 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 18:06:55 crc kubenswrapper[4892]: I0217 18:06:55.295534 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 18:06:55 crc kubenswrapper[4892]: I0217 18:06:55.981626 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 18:06:56 crc kubenswrapper[4892]: I0217 18:06:56.295454 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 18:06:56 crc 
kubenswrapper[4892]: I0217 18:06:56.295529 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 18:06:57 crc kubenswrapper[4892]: I0217 18:06:57.303051 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 18:06:57 crc kubenswrapper[4892]: I0217 18:06:57.311093 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 18:07:03 crc kubenswrapper[4892]: I0217 18:07:03.376655 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 18:07:03 crc kubenswrapper[4892]: I0217 18:07:03.378963 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 18:07:03 crc kubenswrapper[4892]: I0217 18:07:03.385353 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 18:07:03 crc kubenswrapper[4892]: I0217 18:07:03.519162 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 18:07:04 crc kubenswrapper[4892]: I0217 18:07:04.053604 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 18:07:04 crc kubenswrapper[4892]: I0217 18:07:04.062063 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 18:07:06 crc kubenswrapper[4892]: I0217 18:07:06.303205 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Feb 17 18:07:06 crc kubenswrapper[4892]: I0217 18:07:06.303506 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 18:07:06 crc kubenswrapper[4892]: I0217 18:07:06.310237 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 18:07:06 crc kubenswrapper[4892]: I0217 18:07:06.312340 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 18:07:07 crc kubenswrapper[4892]: I0217 18:07:07.425226 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:07:07 crc kubenswrapper[4892]: I0217 18:07:07.425896 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:07:07 crc kubenswrapper[4892]: I0217 18:07:07.425945 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:07:07 crc kubenswrapper[4892]: I0217 18:07:07.426797 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b5f9c9cd974cd254191642f6cc48f52aa43de6e7dce9d7b7d9b694f86f42344"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:07:07 crc kubenswrapper[4892]: I0217 
18:07:07.426877 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://0b5f9c9cd974cd254191642f6cc48f52aa43de6e7dce9d7b7d9b694f86f42344" gracePeriod=600 Feb 17 18:07:08 crc kubenswrapper[4892]: I0217 18:07:08.091651 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="0b5f9c9cd974cd254191642f6cc48f52aa43de6e7dce9d7b7d9b694f86f42344" exitCode=0 Feb 17 18:07:08 crc kubenswrapper[4892]: I0217 18:07:08.091738 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"0b5f9c9cd974cd254191642f6cc48f52aa43de6e7dce9d7b7d9b694f86f42344"} Feb 17 18:07:08 crc kubenswrapper[4892]: I0217 18:07:08.092140 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"} Feb 17 18:07:08 crc kubenswrapper[4892]: I0217 18:07:08.092164 4892 scope.go:117] "RemoveContainer" containerID="39bb78d4e8e45cbad6e675e4abf8cc16247b09380bca60a00134831853f3fc17" Feb 17 18:07:09 crc kubenswrapper[4892]: I0217 18:07:09.175449 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 18:07:27 crc kubenswrapper[4892]: I0217 18:07:27.740359 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 18:07:27 crc kubenswrapper[4892]: I0217 18:07:27.871734 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 17 18:07:27 crc kubenswrapper[4892]: I0217 18:07:27.872202 4892 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="0f0771f7-1250-403c-92b9-72411ed34b2a" containerName="openstackclient" containerID="cri-o://9e7e188405b6a8e4ef6bb57b42cbb6f284b2256f64aba5c60b2ec472f06d945a" gracePeriod=2 Feb 17 18:07:27 crc kubenswrapper[4892]: I0217 18:07:27.898770 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 17 18:07:27 crc kubenswrapper[4892]: I0217 18:07:27.955102 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3909-account-create-update-zjjz8"] Feb 17 18:07:27 crc kubenswrapper[4892]: I0217 18:07:27.981869 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3909-account-create-update-zjjz8"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.004878 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3909-account-create-update-z2hg5"] Feb 17 18:07:28 crc kubenswrapper[4892]: E0217 18:07:28.005622 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0771f7-1250-403c-92b9-72411ed34b2a" containerName="openstackclient" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.005709 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0771f7-1250-403c-92b9-72411ed34b2a" containerName="openstackclient" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.006006 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0771f7-1250-403c-92b9-72411ed34b2a" containerName="openstackclient" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.006801 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.014262 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.015996 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3909-account-create-update-z2hg5"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.071906 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5c69-account-create-update-dl45f"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.094883 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5c69-account-create-update-dl45f"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.099782 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmlfq\" (UniqueName: \"kubernetes.io/projected/588baeb0-b9ae-4dcf-83c6-36b641359bfd-kube-api-access-gmlfq\") pod \"cinder-3909-account-create-update-z2hg5\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.099919 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/588baeb0-b9ae-4dcf-83c6-36b641359bfd-operator-scripts\") pod \"cinder-3909-account-create-update-z2hg5\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.200414 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.206420 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmlfq\" (UniqueName: 
\"kubernetes.io/projected/588baeb0-b9ae-4dcf-83c6-36b641359bfd-kube-api-access-gmlfq\") pod \"cinder-3909-account-create-update-z2hg5\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.206522 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/588baeb0-b9ae-4dcf-83c6-36b641359bfd-operator-scripts\") pod \"cinder-3909-account-create-update-z2hg5\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.207292 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/588baeb0-b9ae-4dcf-83c6-36b641359bfd-operator-scripts\") pod \"cinder-3909-account-create-update-z2hg5\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.222143 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5c69-account-create-update-j2dxf"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.223657 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.228854 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.261126 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-858jh"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.261511 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-858jh" podUID="aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" containerName="openstack-network-exporter" containerID="cri-o://63921be3f9a636b065a7e01b28b406aaf94dbcbae3cf8b99719f9d18b1baacbb" gracePeriod=30 Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.323739 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8n9s7"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.387250 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmlfq\" (UniqueName: \"kubernetes.io/projected/588baeb0-b9ae-4dcf-83c6-36b641359bfd-kube-api-access-gmlfq\") pod \"cinder-3909-account-create-update-z2hg5\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.403373 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cq76l"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.425779 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.426829 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bz7v2"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.449954 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.476975 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5c69-account-create-update-j2dxf"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.540402 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qjz\" (UniqueName: \"kubernetes.io/projected/40312e89-cbb3-4c9f-b8bc-33113bd6462f-kube-api-access-b5qjz\") pod \"glance-5c69-account-create-update-j2dxf\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.540747 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40312e89-cbb3-4c9f-b8bc-33113bd6462f-operator-scripts\") pod \"glance-5c69-account-create-update-j2dxf\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.540881 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cq76l"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.610645 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-m59z9"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.642358 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qjz\" (UniqueName: 
\"kubernetes.io/projected/40312e89-cbb3-4c9f-b8bc-33113bd6462f-kube-api-access-b5qjz\") pod \"glance-5c69-account-create-update-j2dxf\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.642421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts\") pod \"root-account-create-update-cq76l\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.642504 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctl5\" (UniqueName: \"kubernetes.io/projected/7a130c05-1db7-4d93-899e-05d086a2bcca-kube-api-access-tctl5\") pod \"root-account-create-update-cq76l\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.642575 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40312e89-cbb3-4c9f-b8bc-33113bd6462f-operator-scripts\") pod \"glance-5c69-account-create-update-j2dxf\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.646047 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40312e89-cbb3-4c9f-b8bc-33113bd6462f-operator-scripts\") pod \"glance-5c69-account-create-update-j2dxf\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.648865 4892 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.656872 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-m59z9"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.673567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qjz\" (UniqueName: \"kubernetes.io/projected/40312e89-cbb3-4c9f-b8bc-33113bd6462f-kube-api-access-b5qjz\") pod \"glance-5c69-account-create-update-j2dxf\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.711794 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.716899 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="ovn-northd" containerID="cri-o://ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a" gracePeriod=30 Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.717051 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="openstack-network-exporter" containerID="cri-o://f5ee250509d5fcfefa3da2589c17b1ebe3b3316577d89b3435a80ae89dc5bf32" gracePeriod=30 Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.743895 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts\") pod \"root-account-create-update-cq76l\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 
18:07:28.743981 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctl5\" (UniqueName: \"kubernetes.io/projected/7a130c05-1db7-4d93-899e-05d086a2bcca-kube-api-access-tctl5\") pod \"root-account-create-update-cq76l\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.744932 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts\") pod \"root-account-create-update-cq76l\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.782257 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-fa7c-account-create-update-q289d"] Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.783884 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.789857 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.805976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctl5\" (UniqueName: \"kubernetes.io/projected/7a130c05-1db7-4d93-899e-05d086a2bcca-kube-api-access-tctl5\") pod \"root-account-create-update-cq76l\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.886351 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:28 crc kubenswrapper[4892]: E0217 18:07:28.936901 4892 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-bz7v2" message="Exiting ovn-controller (1) " Feb 17 18:07:28 crc kubenswrapper[4892]: E0217 18:07:28.939120 4892 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-bz7v2" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller" containerID="cri-o://2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.939192 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-bz7v2" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller" containerID="cri-o://2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" gracePeriod=30 Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.973282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e335cae7-2188-47a9-8eef-f53353b22dd4-operator-scripts\") pod \"barbican-fa7c-account-create-update-q289d\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.973427 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226mw\" (UniqueName: \"kubernetes.io/projected/e335cae7-2188-47a9-8eef-f53353b22dd4-kube-api-access-226mw\") pod 
\"barbican-fa7c-account-create-update-q289d\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:28 crc kubenswrapper[4892]: I0217 18:07:28.993631 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7rrhb"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.010721 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fa7c-account-create-update-q289d"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.069397 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fa7c-account-create-update-zqj46"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.078647 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e335cae7-2188-47a9-8eef-f53353b22dd4-operator-scripts\") pod \"barbican-fa7c-account-create-update-q289d\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.079699 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-226mw\" (UniqueName: \"kubernetes.io/projected/e335cae7-2188-47a9-8eef-f53353b22dd4-kube-api-access-226mw\") pod \"barbican-fa7c-account-create-update-q289d\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.080468 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.081514 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e335cae7-2188-47a9-8eef-f53353b22dd4-operator-scripts\") pod \"barbican-fa7c-account-create-update-q289d\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.082503 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-fa7c-account-create-update-zqj46"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.093451 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2050-account-create-update-k5ss6"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.118549 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-226mw\" (UniqueName: \"kubernetes.io/projected/e335cae7-2188-47a9-8eef-f53353b22dd4-kube-api-access-226mw\") pod \"barbican-fa7c-account-create-update-q289d\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.137213 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2050-account-create-update-k5ss6"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.192390 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7rrhb"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.206727 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.208769 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-q48cp"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.226488 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-q48cp"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.239302 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rvn94"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.247901 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rvn94"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.290430 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-79pp7"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.353882 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-79pp7"] Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.468408 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" containerID="cri-o://2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" gracePeriod=29 Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.543484 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c" path="/var/lib/kubelet/pods/3a3ae569-94c4-4d8e-b97c-b73a0f4c2b5c/volumes" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.544800 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be41152-a14a-43b1-b24e-221e614556df" path="/var/lib/kubelet/pods/3be41152-a14a-43b1-b24e-221e614556df/volumes" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.582624 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-858jh_aa192ddc-2dc5-4684-afc7-29c9e9db5f6b/openstack-network-exporter/0.log" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.582664 4892 generic.go:334] "Generic (PLEG): container finished" podID="aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" containerID="63921be3f9a636b065a7e01b28b406aaf94dbcbae3cf8b99719f9d18b1baacbb" exitCode=2 Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.609723 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46aabc4c-95df-425a-bde5-db7523f34d7f" path="/var/lib/kubelet/pods/46aabc4c-95df-425a-bde5-db7523f34d7f/volumes" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.618056 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b3dff9-6db4-4820-9920-9e5a24401e98" path="/var/lib/kubelet/pods/48b3dff9-6db4-4820-9920-9e5a24401e98/volumes" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.618732 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26d854a-7e7b-4a84-a611-d0672cea173d" path="/var/lib/kubelet/pods/b26d854a-7e7b-4a84-a611-d0672cea173d/volumes" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.619644 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b151d3-a2dd-4201-b4a4-e2f332530eaa" path="/var/lib/kubelet/pods/b6b151d3-a2dd-4201-b4a4-e2f332530eaa/volumes" Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.628152 4892 generic.go:334] "Generic (PLEG): container finished" podID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerID="f5ee250509d5fcfefa3da2589c17b1ebe3b3316577d89b3435a80ae89dc5bf32" exitCode=2 Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.648321 4892 generic.go:334] "Generic (PLEG): container finished" podID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerID="2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" exitCode=0 Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.702056 4892 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="bad7ea87-ce2b-4897-96b2-99ff27b92c8d" path="/var/lib/kubelet/pods/bad7ea87-ce2b-4897-96b2-99ff27b92c8d/volumes"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.708981 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85111e5-c198-4b36-bdce-5eb7a8ff75ea" path="/var/lib/kubelet/pods/e85111e5-c198-4b36-bdce-5eb7a8ff75ea/volumes"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.712662 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8ce869-af70-43d7-b341-cf6c53900d97" path="/var/lib/kubelet/pods/fb8ce869-af70-43d7-b341-cf6c53900d97/volumes"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.713378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-858jh" event={"ID":"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b","Type":"ContainerDied","Data":"63921be3f9a636b065a7e01b28b406aaf94dbcbae3cf8b99719f9d18b1baacbb"}
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.713421 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b688e91-e3c2-4a0f-a784-a694f951ea5e","Type":"ContainerDied","Data":"f5ee250509d5fcfefa3da2589c17b1ebe3b3316577d89b3435a80ae89dc5bf32"}
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.713441 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2" event={"ID":"ff799349-84ed-44f7-8f46-d11d5637abf1","Type":"ContainerDied","Data":"2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991"}
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.713454 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5f5a-account-create-update-g25cr"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.716313 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-g25cr"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.716337 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-4885s"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.722559 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.725659 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-4885s"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.725774 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.743504 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-operator-scripts\") pod \"nova-api-5f5a-account-create-update-g25cr\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.743560 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sn52\" (UniqueName: \"kubernetes.io/projected/d5e19d46-3d45-42e4-b59f-d25925415b3c-kube-api-access-6sn52\") pod \"nova-cell0-57f0-account-create-update-4885s\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.743675 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjjx\" (UniqueName: \"kubernetes.io/projected/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-kube-api-access-7fjjx\") pod \"nova-api-5f5a-account-create-update-g25cr\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.743737 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e19d46-3d45-42e4-b59f-d25925415b3c-operator-scripts\") pod \"nova-cell0-57f0-account-create-update-4885s\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.751579 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.816120 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7299-account-create-update-frm9q"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.842444 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.843008 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-frm9q"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.855220 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.871926 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-frm9q"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.891649 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjjx\" (UniqueName: \"kubernetes.io/projected/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-kube-api-access-7fjjx\") pod \"nova-api-5f5a-account-create-update-g25cr\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.891850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e19d46-3d45-42e4-b59f-d25925415b3c-operator-scripts\") pod \"nova-cell0-57f0-account-create-update-4885s\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.891939 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.892009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljdlr\" (UniqueName: \"kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.892034 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-operator-scripts\") pod \"nova-api-5f5a-account-create-update-g25cr\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.892082 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sn52\" (UniqueName: \"kubernetes.io/projected/d5e19d46-3d45-42e4-b59f-d25925415b3c-kube-api-access-6sn52\") pod \"nova-cell0-57f0-account-create-update-4885s\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.893801 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e19d46-3d45-42e4-b59f-d25925415b3c-operator-scripts\") pod \"nova-cell0-57f0-account-create-update-4885s\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.894148 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-operator-scripts\") pod \"nova-api-5f5a-account-create-update-g25cr\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.924011 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.924657 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="openstack-network-exporter" containerID="cri-o://091908dabe7cfe9736d9853d937fdfd18cc05ba8b8dcdcc384696d866e69fa27" gracePeriod=300
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.943200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjjx\" (UniqueName: \"kubernetes.io/projected/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-kube-api-access-7fjjx\") pod \"nova-api-5f5a-account-create-update-g25cr\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.953769 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-tdc8n"]
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.954032 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerName="dnsmasq-dns" containerID="cri-o://047fafbe070b5c5f98549aa6ce2f47850884bd46a8bba02c12bc6d546ba53b35" gracePeriod=10
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.964277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sn52\" (UniqueName: \"kubernetes.io/projected/d5e19d46-3d45-42e4-b59f-d25925415b3c-kube-api-access-6sn52\") pod \"nova-cell0-57f0-account-create-update-4885s\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.993859 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q"
Feb 17 18:07:29 crc kubenswrapper[4892]: I0217 18:07:29.993928 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljdlr\" (UniqueName: \"kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q"
Feb 17 18:07:29 crc kubenswrapper[4892]: E0217 18:07:29.994271 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Feb 17 18:07:29 crc kubenswrapper[4892]: E0217 18:07:29.994318 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:30.494301933 +0000 UTC m=+1421.869705198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : configmap "openstack-cell1-scripts" not found
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.001612 4892 projected.go:194] Error preparing data for projected volume kube-api-access-ljdlr for pod openstack/nova-cell1-7299-account-create-update-frm9q: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.001675 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:30.501659302 +0000 UTC m=+1421.877062567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ljdlr" (UniqueName: "kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.003099 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-bhdvm"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.038917 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-bhdvm"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.084205 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.090094 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="cinder-scheduler" containerID="cri-o://276fbe4a642e629846be447b31c24d7070dfa435158a65fb8bc262ffc1b036a1" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.090549 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="probe" containerID="cri-o://72c2cbaf2de54480ce7abb484d8c16ea293b67cb726839d9a2f462baee040be3" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.091142 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-g25cr"
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.107394 4892 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Feb 17 18:07:30 crc kubenswrapper[4892]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 17 18:07:30 crc kubenswrapper[4892]: + source /usr/local/bin/container-scripts/functions
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNBridge=br-int
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNRemote=tcp:localhost:6642
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNEncapType=geneve
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNAvailabilityZones=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ EnableChassisAsGateway=true
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ PhysicalNetworks=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNHostName=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ ovs_dir=/var/lib/openvswitch
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + sleep 0.5
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + sleep 0.5
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + cleanup_ovsdb_server_semaphore
Feb 17 18:07:30 crc kubenswrapper[4892]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 17 18:07:30 crc kubenswrapper[4892]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 17 18:07:30 crc kubenswrapper[4892]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-8n9s7" message=<
Feb 17 18:07:30 crc kubenswrapper[4892]: Exiting ovsdb-server (5) [ OK ]
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 17 18:07:30 crc kubenswrapper[4892]: + source /usr/local/bin/container-scripts/functions
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNBridge=br-int
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNRemote=tcp:localhost:6642
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNEncapType=geneve
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNAvailabilityZones=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ EnableChassisAsGateway=true
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ PhysicalNetworks=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNHostName=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ ovs_dir=/var/lib/openvswitch
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + sleep 0.5
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + sleep 0.5
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + cleanup_ovsdb_server_semaphore
Feb 17 18:07:30 crc kubenswrapper[4892]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 17 18:07:30 crc kubenswrapper[4892]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 17 18:07:30 crc kubenswrapper[4892]: >
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.107439 4892 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 17 18:07:30 crc kubenswrapper[4892]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 17 18:07:30 crc kubenswrapper[4892]: + source /usr/local/bin/container-scripts/functions
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNBridge=br-int
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNRemote=tcp:localhost:6642
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNEncapType=geneve
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNAvailabilityZones=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ EnableChassisAsGateway=true
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ PhysicalNetworks=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ OVNHostName=
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ ovs_dir=/var/lib/openvswitch
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 17 18:07:30 crc kubenswrapper[4892]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + sleep 0.5
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + sleep 0.5
Feb 17 18:07:30 crc kubenswrapper[4892]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 17 18:07:30 crc kubenswrapper[4892]: + cleanup_ovsdb_server_semaphore
Feb 17 18:07:30 crc kubenswrapper[4892]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 17 18:07:30 crc kubenswrapper[4892]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 17 18:07:30 crc kubenswrapper[4892]: > pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" containerID="cri-o://e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.107477 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" containerID="cri-o://e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" gracePeriod=29
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.107648 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="ovsdbserver-nb" containerID="cri-o://82eb422a29271ebf3f8cf145783a1075bbaffa1ac3279210e6826fd4595ca345" gracePeriod=300
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.111964 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.112718 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="openstack-network-exporter" containerID="cri-o://c98b346276d9129a2832ecb16437471be06a6bf5533e4bd04d38cba334c672a5" gracePeriod=300
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.144166 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-4885s"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.157928 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nc5l8"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.178020 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nc5l8"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.192365 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.192638 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api-log" containerID="cri-o://0b8f5fa1d04ca87abf7de487fc63b1102d6d2d69e2adcaa02f32fd33d6c4c382" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.194108 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api" containerID="cri-o://890f0489886784d4621fef4344f5982943816d37e5cb528aa0fe4904d5ef40c8" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.199639 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991 is running failed: container process not found" containerID="2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.201019 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991 is running failed: container process not found" containerID="2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.202101 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991 is running failed: container process not found" containerID="2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.202147 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-bz7v2" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.204003 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5758f86b57-ddm7q"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.204302 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5758f86b57-ddm7q" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-api" containerID="cri-o://fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.204718 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5758f86b57-ddm7q" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-httpd" containerID="cri-o://b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.220074 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.235848 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-lphkn"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.244743 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-lphkn"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.252424 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.254745 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.254804 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-dlm9t"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.258968 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="ovsdbserver-sb" containerID="cri-o://1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85" gracePeriod=300
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.259072 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.259110 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server"
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.261839 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.261969 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-dlm9t"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.278129 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.278683 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.290728 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="rabbitmq" containerID="cri-o://c90905bd645724d583f06264c835c33dbac29aab54cd0307b6a53d4573add124" gracePeriod=60
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.291098 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.291127 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd"
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.291165 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.294827 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.294902 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="ovn-northd"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.296759 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-858jh_aa192ddc-2dc5-4684-afc7-29c9e9db5f6b/openstack-network-exporter/0.log"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.296828 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-858jh"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.300277 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bz7v2"
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.301394 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-65gsr"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.324223 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-xgmsg"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.349303 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-65gsr"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.380880 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-xgmsg"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.414897 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db8547d8d-ftgvm"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.415099 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-db8547d8d-ftgvm" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-log" containerID="cri-o://8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.415503 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-db8547d8d-ftgvm" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-api" containerID="cri-o://f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.440367 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.443758 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-server" containerID="cri-o://736363b29d80b41bd0b961a49490e559581f0ae8dd26533460a7abdb41743241" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444065 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-updater" containerID="cri-o://58e0df0679c6d02fa3c1eeeb9fd8b0c6a7d302f3247b2ed8e7c839d88c861c37" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444114 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="swift-recon-cron" containerID="cri-o://40d70537bfb2a1c3a8fc4994456c79434226023147aca15f0cc917598c474f70" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444149 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="rsync" containerID="cri-o://560fe248893cb8f1a43c62388a6d42d718ed7cc7f26d7e07babf294df3388633" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444181 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-expirer" containerID="cri-o://ec801cf9cb9854df9d768df41b9402e3ae9e7297f0a3ac7eca720a4609e68e35" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444211 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-updater" containerID="cri-o://4a392a3a4be90b644c07bd0ce9b8c0d432df6220823557ff7db78a582c418255" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444240 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-auditor" containerID="cri-o://651defaed761921ccf7c7a6cc3b90f243a62d645f799923b98e7b01c8fdaecf1" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444267 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-replicator" containerID="cri-o://94a10ed2d5fee50a1d21331a9c66ebfdc3f24fee2209a9431f1451ef17c41252" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444297 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-server" containerID="cri-o://4d5dff4d38a7a79f798428eae45a2013e22a5357dd2fcd764f8587d896667672" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444342 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-reaper" containerID="cri-o://3cf97b246bab6355f5a8e7bc81f3bddf502ccd3c47dc0316fda01c98e9d599dc" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444388 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-auditor" containerID="cri-o://ecc298da1bf313ad808f705d76569aadd8ccd7b7dec598f4818582d15e76c068" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444418 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-replicator" containerID="cri-o://0f12b3eda7e47024c7abf99bbcbb51563466d6864f1fb053fa6d1defadfb7e85" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444443 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-server" containerID="cri-o://c1d342baf3f40d0f5eafc73a64df7c0cd93b1f05a6ea729e27e0c6aa8625b809" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444481 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-replicator" containerID="cri-o://5215ee6df896cbac8711678f30bfc8c34ed8386052d37de70cf674a5aa175b70" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.444512 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-auditor" containerID="cri-o://6ad15eb01bd19157c80c0bb24ae02bfdeddfc5c770f3fde85bbd8a6f9ea1c582" gracePeriod=30
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470404 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-combined-ca-bundle\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") "
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470495 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff799349-84ed-44f7-8f46-d11d5637abf1-scripts\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") "
Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470605 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovs-rundir\") pod
\"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-ovn-controller-tls-certs\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470699 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-config\") pod \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470725 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-combined-ca-bundle\") pod \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470753 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovn-rundir\") pod \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf4w8\" (UniqueName: \"kubernetes.io/projected/ff799349-84ed-44f7-8f46-d11d5637abf1-kube-api-access-rf4w8\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470905 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run-ovn\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.470966 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-log-ovn\") pod \"ff799349-84ed-44f7-8f46-d11d5637abf1\" (UID: \"ff799349-84ed-44f7-8f46-d11d5637abf1\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.471018 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-metrics-certs-tls-certs\") pod \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.471067 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48scl\" (UniqueName: \"kubernetes.io/projected/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-kube-api-access-48scl\") pod \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\" (UID: \"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b\") " Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.473123 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.474490 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" (UID: "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.474524 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.474544 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run" (OuterVolumeSpecName: "var-run") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.475106 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff799349-84ed-44f7-8f46-d11d5637abf1-scripts" (OuterVolumeSpecName: "scripts") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.475232 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-config" (OuterVolumeSpecName: "config") pod "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" (UID: "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.475771 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" (UID: "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.480329 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff799349-84ed-44f7-8f46-d11d5637abf1-kube-api-access-rf4w8" (OuterVolumeSpecName: "kube-api-access-rf4w8") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). InnerVolumeSpecName "kube-api-access-rf4w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.488026 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-kube-api-access-48scl" (OuterVolumeSpecName: "kube-api-access-48scl") pod "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" (UID: "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b"). InnerVolumeSpecName "kube-api-access-48scl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.491164 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-j5ngd"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.510032 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-j5ngd"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.521322 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzqfl"] Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.526257 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:30 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: if [ -n "cinder" ]; then Feb 17 18:07:30 crc kubenswrapper[4892]: GRANT_DATABASE="cinder" Feb 17 18:07:30 crc kubenswrapper[4892]: else Feb 17 18:07:30 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:30 crc kubenswrapper[4892]: fi Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:30 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:30 crc kubenswrapper[4892]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:30 crc kubenswrapper[4892]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:30 crc kubenswrapper[4892]: # support updates Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.529728 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzqfl"] Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.529755 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-3909-account-create-update-z2hg5" podUID="588baeb0-b9ae-4dcf-83c6-36b641359bfd" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.558308 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3909-account-create-update-z2hg5"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573305 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573368 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljdlr\" (UniqueName: \"kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573617 4892 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48scl\" (UniqueName: \"kubernetes.io/projected/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-kube-api-access-48scl\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573640 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff799349-84ed-44f7-8f46-d11d5637abf1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573652 4892 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573663 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573676 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573686 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf4w8\" (UniqueName: \"kubernetes.io/projected/ff799349-84ed-44f7-8f46-d11d5637abf1-kube-api-access-rf4w8\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573696 4892 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573703 4892 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.573713 4892 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff799349-84ed-44f7-8f46-d11d5637abf1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.575053 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.575106 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:31.575092431 +0000 UTC m=+1422.950495686 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : configmap "openstack-cell1-scripts" not found Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.609072 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.609281 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-log" containerID="cri-o://094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc" gracePeriod=30 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.609650 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" 
containerName="glance-httpd" containerID="cri-o://a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e" gracePeriod=30 Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.610047 4892 projected.go:194] Error preparing data for projected volume kube-api-access-ljdlr for pod openstack/nova-cell1-7299-account-create-update-frm9q: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.610109 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:31.610092726 +0000 UTC m=+1422.985495991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljdlr" (UniqueName: "kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.642931 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.649122 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-log" containerID="cri-o://bbca05ba8000e1544dd1256fe1d48355fe4077385199194480c09fd31d0d03ad" gracePeriod=30 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.655785 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-httpd" 
containerID="cri-o://c99d3760836f82bf94729ec9f9c217c2b3cda31831ae4231d401c839708c201c" gracePeriod=30 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.759377 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerID="8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3" exitCode=143 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.759865 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db8547d8d-ftgvm" event={"ID":"5c6047a2-148b-46cf-a50b-b7147c7c9902","Type":"ContainerDied","Data":"8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.769855 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c085ee96-4617-4fa6-b546-a68d29c6238b/ovsdbserver-nb/0.log" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.770086 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c085ee96-4617-4fa6-b546-a68d29c6238b","Type":"ContainerDied","Data":"091908dabe7cfe9736d9853d937fdfd18cc05ba8b8dcdcc384696d866e69fa27"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.770213 4892 generic.go:334] "Generic (PLEG): container finished" podID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerID="091908dabe7cfe9736d9853d937fdfd18cc05ba8b8dcdcc384696d866e69fa27" exitCode=2 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.770275 4892 generic.go:334] "Generic (PLEG): container finished" podID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerID="82eb422a29271ebf3f8cf145783a1075bbaffa1ac3279210e6826fd4595ca345" exitCode=143 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.770342 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c085ee96-4617-4fa6-b546-a68d29c6238b","Type":"ContainerDied","Data":"82eb422a29271ebf3f8cf145783a1075bbaffa1ac3279210e6826fd4595ca345"} Feb 
17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.773373 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-858jh_aa192ddc-2dc5-4684-afc7-29c9e9db5f6b/openstack-network-exporter/0.log" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.773557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-858jh" event={"ID":"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b","Type":"ContainerDied","Data":"ec45d12b51ac0478080a8880ebb2a56c1612762345c8796d6face1374704b5a8"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.774360 4892 scope.go:117] "RemoveContainer" containerID="63921be3f9a636b065a7e01b28b406aaf94dbcbae3cf8b99719f9d18b1baacbb" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.774562 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-858jh" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.792209 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.792300 4892 generic.go:334] "Generic (PLEG): container finished" podID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerID="b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.792416 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5758f86b57-ddm7q" event={"ID":"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33","Type":"ContainerDied","Data":"b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.822030 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tj55v"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.853665 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.854033 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tj55v"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.856051 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" (UID: "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905429 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="ec801cf9cb9854df9d768df41b9402e3ae9e7297f0a3ac7eca720a4609e68e35" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905723 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="4a392a3a4be90b644c07bd0ce9b8c0d432df6220823557ff7db78a582c418255" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905735 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="651defaed761921ccf7c7a6cc3b90f243a62d645f799923b98e7b01c8fdaecf1" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905747 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="94a10ed2d5fee50a1d21331a9c66ebfdc3f24fee2209a9431f1451ef17c41252" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905756 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="3cf97b246bab6355f5a8e7bc81f3bddf502ccd3c47dc0316fda01c98e9d599dc" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905857 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"ec801cf9cb9854df9d768df41b9402e3ae9e7297f0a3ac7eca720a4609e68e35"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905893 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"4a392a3a4be90b644c07bd0ce9b8c0d432df6220823557ff7db78a582c418255"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905908 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"651defaed761921ccf7c7a6cc3b90f243a62d645f799923b98e7b01c8fdaecf1"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905919 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"94a10ed2d5fee50a1d21331a9c66ebfdc3f24fee2209a9431f1451ef17c41252"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.905931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"3cf97b246bab6355f5a8e7bc81f3bddf502ccd3c47dc0316fda01c98e9d599dc"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.933825 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2024-account-create-update-lhrkp"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.947035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bz7v2" event={"ID":"ff799349-84ed-44f7-8f46-d11d5637abf1","Type":"ContainerDied","Data":"bab04d84fb9fff3cd8342885cce22150d7b9a68faff5371f3ce3ae03d1da9f28"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.947366 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bz7v2" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.956398 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3909-account-create-update-z2hg5" event={"ID":"588baeb0-b9ae-4dcf-83c6-36b641359bfd","Type":"ContainerStarted","Data":"c390e6b748815be52135ceb917b45b1047d3687f25d4e14c38231809c529dba4"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.961515 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.963195 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2024-account-create-update-lhrkp"] Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.981089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerDied","Data":"e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64"} Feb 17 18:07:30 crc kubenswrapper[4892]: I0217 18:07:30.980165 4892 generic.go:334] "Generic (PLEG): container finished" podID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" exitCode=0 Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.976514 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:30 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: export 
DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: if [ -n "cinder" ]; then Feb 17 18:07:30 crc kubenswrapper[4892]: GRANT_DATABASE="cinder" Feb 17 18:07:30 crc kubenswrapper[4892]: else Feb 17 18:07:30 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:30 crc kubenswrapper[4892]: fi Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:30 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:30 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:30 crc kubenswrapper[4892]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:30 crc kubenswrapper[4892]: # support updates Feb 17 18:07:30 crc kubenswrapper[4892]: Feb 17 18:07:30 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:30 crc kubenswrapper[4892]: E0217 18:07:30.988176 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-3909-account-create-update-z2hg5" podUID="588baeb0-b9ae-4dcf-83c6-36b641359bfd" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:30.999219 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.004728 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" (UID: 
"aa192ddc-2dc5-4684-afc7-29c9e9db5f6b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.010461 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_747e7c96-8d95-4c34-9ff3-83dc8c793fc2/ovsdbserver-sb/0.log" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.010500 4892 generic.go:334] "Generic (PLEG): container finished" podID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerID="c98b346276d9129a2832ecb16437471be06a6bf5533e4bd04d38cba334c672a5" exitCode=2 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.010517 4892 generic.go:334] "Generic (PLEG): container finished" podID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerID="1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85" exitCode=143 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.010562 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"747e7c96-8d95-4c34-9ff3-83dc8c793fc2","Type":"ContainerDied","Data":"c98b346276d9129a2832ecb16437471be06a6bf5533e4bd04d38cba334c672a5"} Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.010595 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"747e7c96-8d95-4c34-9ff3-83dc8c793fc2","Type":"ContainerDied","Data":"1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85"} Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.024085 4892 generic.go:334] "Generic (PLEG): container finished" podID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerID="0b8f5fa1d04ca87abf7de487fc63b1102d6d2d69e2adcaa02f32fd33d6c4c382" exitCode=143 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.024206 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b","Type":"ContainerDied","Data":"0b8f5fa1d04ca87abf7de487fc63b1102d6d2d69e2adcaa02f32fd33d6c4c382"} Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.047066 4892 generic.go:334] "Generic (PLEG): container finished" podID="0f0771f7-1250-403c-92b9-72411ed34b2a" containerID="9e7e188405b6a8e4ef6bb57b42cbb6f284b2256f64aba5c60b2ec472f06d945a" exitCode=137 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.082874 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.086892 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5c69-account-create-update-j2dxf"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.098067 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerID="047fafbe070b5c5f98549aa6ce2f47850884bd46a8bba02c12bc6d546ba53b35" exitCode=0 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.098114 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" event={"ID":"f9e84558-a02a-4297-87f2-6b69a3b5f452","Type":"ContainerDied","Data":"047fafbe070b5c5f98549aa6ce2f47850884bd46a8bba02c12bc6d546ba53b35"} Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.119944 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "ff799349-84ed-44f7-8f46-d11d5637abf1" (UID: "ff799349-84ed-44f7-8f46-d11d5637abf1"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.137525 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-tvldr"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.161011 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="rabbitmq" containerID="cri-o://fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea" gracePeriod=60 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.181063 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-tvldr"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.187954 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.192278 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff799349-84ed-44f7-8f46-d11d5637abf1-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.207118 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.217729 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vvgtf"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.252234 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vvgtf"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.272298 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7xltk"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.282160 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7xltk"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296191 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-sb\") pod \"f9e84558-a02a-4297-87f2-6b69a3b5f452\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296267 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-nb\") pod \"f9e84558-a02a-4297-87f2-6b69a3b5f452\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296307 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config\") pod \"0f0771f7-1250-403c-92b9-72411ed34b2a\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296349 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config-secret\") pod \"0f0771f7-1250-403c-92b9-72411ed34b2a\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296444 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-config\") pod \"f9e84558-a02a-4297-87f2-6b69a3b5f452\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296465 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-svc\") pod \"f9e84558-a02a-4297-87f2-6b69a3b5f452\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296534 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-combined-ca-bundle\") pod \"0f0771f7-1250-403c-92b9-72411ed34b2a\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296586 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m9g9\" (UniqueName: \"kubernetes.io/projected/0f0771f7-1250-403c-92b9-72411ed34b2a-kube-api-access-5m9g9\") pod \"0f0771f7-1250-403c-92b9-72411ed34b2a\" (UID: \"0f0771f7-1250-403c-92b9-72411ed34b2a\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.296643 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-swift-storage-0\") pod \"f9e84558-a02a-4297-87f2-6b69a3b5f452\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 
18:07:31.296693 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bldm\" (UniqueName: \"kubernetes.io/projected/f9e84558-a02a-4297-87f2-6b69a3b5f452-kube-api-access-8bldm\") pod \"f9e84558-a02a-4297-87f2-6b69a3b5f452\" (UID: \"f9e84558-a02a-4297-87f2-6b69a3b5f452\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.297512 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fa7c-account-create-update-q289d"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.306176 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.307876 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:31 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: if [ -n "glance" ]; then Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="glance" Feb 17 18:07:31 crc kubenswrapper[4892]: else Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:31 crc kubenswrapper[4892]: fi Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:31 crc kubenswrapper[4892]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:31 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:31 crc kubenswrapper[4892]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:31 crc kubenswrapper[4892]: # support updates Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:31 crc kubenswrapper[4892]: W0217 18:07:31.307966 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a130c05_1db7_4d93_899e_05d086a2bcca.slice/crio-f75e5afe3c240e3b7b80ea954bd8614914d8c5319be4f7207619dccdca9e42d5 WatchSource:0}: Error finding container f75e5afe3c240e3b7b80ea954bd8614914d8c5319be4f7207619dccdca9e42d5: Status 404 returned error can't find the container with id f75e5afe3c240e3b7b80ea954bd8614914d8c5319be4f7207619dccdca9e42d5 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.308169 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0771f7-1250-403c-92b9-72411ed34b2a-kube-api-access-5m9g9" (OuterVolumeSpecName: "kube-api-access-5m9g9") pod "0f0771f7-1250-403c-92b9-72411ed34b2a" (UID: "0f0771f7-1250-403c-92b9-72411ed34b2a"). InnerVolumeSpecName "kube-api-access-5m9g9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.312756 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-5c69-account-create-update-j2dxf" podUID="40312e89-cbb3-4c9f-b8bc-33113bd6462f" Feb 17 18:07:31 crc kubenswrapper[4892]: W0217 18:07:31.323293 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode335cae7_2188_47a9_8eef_f53353b22dd4.slice/crio-8a775eb0f599612e05daeaa486a21cee061ad83015907ea034c8e84cea46e06d WatchSource:0}: Error finding container 8a775eb0f599612e05daeaa486a21cee061ad83015907ea034c8e84cea46e06d: Status 404 returned error can't find the container with id 8a775eb0f599612e05daeaa486a21cee061ad83015907ea034c8e84cea46e06d Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.337181 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ctbgj"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.372795 4892 scope.go:117] "RemoveContainer" containerID="2a24b9d22c199e4d14a2a499464981f6e1f269599a861bcbb6d2f7956c026991" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.373173 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e84558-a02a-4297-87f2-6b69a3b5f452-kube-api-access-8bldm" (OuterVolumeSpecName: "kube-api-access-8bldm") pod "f9e84558-a02a-4297-87f2-6b69a3b5f452" (UID: "f9e84558-a02a-4297-87f2-6b69a3b5f452"). InnerVolumeSpecName "kube-api-access-8bldm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.373310 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0f0771f7-1250-403c-92b9-72411ed34b2a" (UID: "0f0771f7-1250-403c-92b9-72411ed34b2a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.383486 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:31 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: if [ -n "" ]; then Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="" Feb 17 18:07:31 crc kubenswrapper[4892]: else Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:31 crc kubenswrapper[4892]: fi Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:31 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:31 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:31 crc kubenswrapper[4892]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:31 crc kubenswrapper[4892]: # support updates Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.384486 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:31 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: if [ -n "barbican" ]; then Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="barbican" Feb 17 18:07:31 crc kubenswrapper[4892]: else Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:31 crc kubenswrapper[4892]: fi Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:31 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:31 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:31 crc kubenswrapper[4892]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:31 crc kubenswrapper[4892]: # support updates Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.387532 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-cq76l" podUID="7a130c05-1db7-4d93-899e-05d086a2bcca" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.388162 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-fa7c-account-create-update-q289d" podUID="e335cae7-2188-47a9-8eef-f53353b22dd4" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.401371 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m9g9\" (UniqueName: \"kubernetes.io/projected/0f0771f7-1250-403c-92b9-72411ed34b2a-kube-api-access-5m9g9\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.401409 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bldm\" (UniqueName: \"kubernetes.io/projected/f9e84558-a02a-4297-87f2-6b69a3b5f452-kube-api-access-8bldm\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.401421 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.430408 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_c085ee96-4617-4fa6-b546-a68d29c6238b/ovsdbserver-nb/0.log" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.430479 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.434649 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c42e2da-4b51-426a-b743-b8c79e358ecb" path="/var/lib/kubelet/pods/2c42e2da-4b51-426a-b743-b8c79e358ecb/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.438448 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484d3758-2960-4e5d-89a8-d36a6cefd791" path="/var/lib/kubelet/pods/484d3758-2960-4e5d-89a8-d36a6cefd791/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.440646 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513" path="/var/lib/kubelet/pods/49dc1deb-f5b7-4b82-bd8e-d1c6dbe9f513/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.441211 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="545242bc-24e5-4521-85d8-7aff5cbd4916" path="/var/lib/kubelet/pods/545242bc-24e5-4521-85d8-7aff5cbd4916/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.457286 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60efd3a5-df5d-4f7c-949b-f9952e234b8d" path="/var/lib/kubelet/pods/60efd3a5-df5d-4f7c-949b-f9952e234b8d/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.458161 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d8e63f-e602-4e70-ac10-58f577858490" path="/var/lib/kubelet/pods/64d8e63f-e602-4e70-ac10-58f577858490/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.461103 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb01add-fd0d-4606-9761-316f009e0002" 
path="/var/lib/kubelet/pods/7eb01add-fd0d-4606-9761-316f009e0002/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.461625 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="932e3900-65d2-4587-9da3-8fb5bea0a354" path="/var/lib/kubelet/pods/932e3900-65d2-4587-9da3-8fb5bea0a354/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.462221 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ed9367-dfce-487f-b826-06981dba28ef" path="/var/lib/kubelet/pods/97ed9367-dfce-487f-b826-06981dba28ef/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.473163 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ef4386-19dd-4bd6-bf1c-53598735a302" path="/var/lib/kubelet/pods/b6ef4386-19dd-4bd6-bf1c-53598735a302/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.473967 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72be734-0fa6-4a40-8f95-984db82d859c" path="/var/lib/kubelet/pods/c72be734-0fa6-4a40-8f95-984db82d859c/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.474461 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e03215-2841-458f-86b7-9bd2882f07a8" path="/var/lib/kubelet/pods/d3e03215-2841-458f-86b7-9bd2882f07a8/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.475057 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78e90e9-8804-4e06-8083-785c64a86a1d" path="/var/lib/kubelet/pods/f78e90e9-8804-4e06-8083-785c64a86a1d/volumes" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.500007 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f0771f7-1250-403c-92b9-72411ed34b2a" (UID: "0f0771f7-1250-403c-92b9-72411ed34b2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.504442 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-combined-ca-bundle\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.504750 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-metrics-certs-tls-certs\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.504770 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvv4x\" (UniqueName: \"kubernetes.io/projected/c085ee96-4617-4fa6-b546-a68d29c6238b-kube-api-access-wvv4x\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.504847 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-scripts\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.504891 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdb-rundir\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.504937 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-config\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.505013 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.505063 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdbserver-nb-tls-certs\") pod \"c085ee96-4617-4fa6-b546-a68d29c6238b\" (UID: \"c085ee96-4617-4fa6-b546-a68d29c6238b\") " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.505488 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.517082 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.523175 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-scripts" (OuterVolumeSpecName: "scripts") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.523946 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-config" (OuterVolumeSpecName: "config") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.547593 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c085ee96-4617-4fa6-b546-a68d29c6238b-kube-api-access-wvv4x" (OuterVolumeSpecName: "kube-api-access-wvv4x") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "kube-api-access-wvv4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.597796 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.602120 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-config" (OuterVolumeSpecName: "config") pod "f9e84558-a02a-4297-87f2-6b69a3b5f452" (UID: "f9e84558-a02a-4297-87f2-6b69a3b5f452"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617414 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617483 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljdlr\" (UniqueName: \"kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617749 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvv4x\" (UniqueName: \"kubernetes.io/projected/c085ee96-4617-4fa6-b546-a68d29c6238b-kube-api-access-wvv4x\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617766 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617777 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617787 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ee96-4617-4fa6-b546-a68d29c6238b-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc 
kubenswrapper[4892]: I0217 18:07:31.617824 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.617837 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.619772 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.619855 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:33.61983776 +0000 UTC m=+1424.995241025 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : configmap "openstack-cell1-scripts" not found Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.622517 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:31 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: if [ -n "nova_api" ]; then Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="nova_api" Feb 17 18:07:31 crc kubenswrapper[4892]: else Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:31 crc kubenswrapper[4892]: fi Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:31 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:31 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:31 crc kubenswrapper[4892]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:31 crc kubenswrapper[4892]: # support updates Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.624476 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-5f5a-account-create-update-g25cr" podUID="c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.625563 4892 projected.go:194] Error preparing data for projected volume kube-api-access-ljdlr for pod openstack/nova-cell1-7299-account-create-update-frm9q: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.625618 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:33.625600306 +0000 UTC m=+1425.001003571 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljdlr" (UniqueName: "kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.655544 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.695269 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9e84558-a02a-4297-87f2-6b69a3b5f452" (UID: "f9e84558-a02a-4297-87f2-6b69a3b5f452"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.719866 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.719898 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.849411 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.853891 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ctbgj"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.854801 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.854863 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-g25cr"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.854937 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-c98h7"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.854959 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-frm9q"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.854976 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-c98h7"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.855910 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-log" containerID="cri-o://f7ec71d9b448df0c427bf1c569f629e2d1b04b498fb150e292de366d671c7e43" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.856291 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ljdlr operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-7299-account-create-update-frm9q" podUID="a6a1925e-e87d-4d6e-b9a3-35e46d58fc54" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.856351 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.856484 4892 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-4885s"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.856632 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-metadata" containerID="cri-o://f69a74c1d7ff5f5e7cab3dfb70cfb2a1f2475fdc6a082b95bd0059bf7596b0bc" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.856924 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dr6ns"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.857131 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-log" containerID="cri-o://9bac58ad39e6590d4042399c64a90295868b511562a801762900f675847c27a2" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.857281 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-api" containerID="cri-o://7576bc6a46f917916ace668163f910721525c50fdb2445e86a823be4d67ae777" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.864953 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9e84558-a02a-4297-87f2-6b69a3b5f452" (UID: "f9e84558-a02a-4297-87f2-6b69a3b5f452"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.875127 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="38862686-bfab-4f7d-8367-ec59a68b0299" containerName="galera" containerID="cri-o://30d52ee98141b0a4b18a9c3892ae7d9fadf3cc7b619452277145c17c67f590bb" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.879837 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85 is running failed: container process not found" containerID="1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.879977 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dr6ns"] Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.896282 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:31 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: if [ -n "nova_cell0" ]; then Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="nova_cell0" Feb 17 18:07:31 crc 
kubenswrapper[4892]: else Feb 17 18:07:31 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:31 crc kubenswrapper[4892]: fi Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:31 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:31 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:31 crc kubenswrapper[4892]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:31 crc kubenswrapper[4892]: # support updates Feb 17 18:07:31 crc kubenswrapper[4892]: Feb 17 18:07:31 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.896773 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9e84558-a02a-4297-87f2-6b69a3b5f452" (UID: "f9e84558-a02a-4297-87f2-6b69a3b5f452"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.899950 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-57f0-account-create-update-4885s" podUID="d5e19d46-3d45-42e4-b59f-d25925415b3c" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.913003 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65fb4dcbbc-jxbk8"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.914415 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker-log" containerID="cri-o://2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.915158 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker" containerID="cri-o://2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.941174 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "c085ee96-4617-4fa6-b546-a68d29c6238b" (UID: "c085ee96-4617-4fa6-b546-a68d29c6238b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.944968 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85 is running failed: container process not found" containerID="1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.955338 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85 is running failed: container process not found" containerID="1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 17 18:07:31 crc kubenswrapper[4892]: E0217 18:07:31.955425 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="ovsdbserver-sb" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.956687 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-667fbbdb6d-fdpm7"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.956997 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener-log" containerID="cri-o://70e153e496f3b77144597bb37e7faa188dca474fe0e9e263bb0ae95405212f33" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: 
I0217 18:07:31.957497 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener" containerID="cri-o://c7cb681f90d6976e6b5cbf00e7ea59a76be308b0fe3edc7745833b928200cf4d" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.962299 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_747e7c96-8d95-4c34-9ff3-83dc8c793fc2/ovsdbserver-sb/0.log" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.962408 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.968871 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77d47b9fd6-4gbj5"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.969138 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77d47b9fd6-4gbj5" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api-log" containerID="cri-o://b921cc6366edc36ee38b6a143bfc88dcd8710d62a31bab58b5973626b24cb7ca" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.969281 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77d47b9fd6-4gbj5" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api" containerID="cri-o://80a0a6f3f3f9ebbe69c194f3e074f48a042c65dd9cb6a3e8b0adc00867e049f1" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.973979 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.974004 4892 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.974020 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.974031 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c085ee96-4617-4fa6-b546-a68d29c6238b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.974544 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.982363 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.984678 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e" gracePeriod=30 Feb 17 18:07:31 crc kubenswrapper[4892]: I0217 18:07:31.993887 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0f0771f7-1250-403c-92b9-72411ed34b2a" (UID: "0f0771f7-1250-403c-92b9-72411ed34b2a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.006224 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cq76l"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.027061 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9e84558-a02a-4297-87f2-6b69a3b5f452" (UID: "f9e84558-a02a-4297-87f2-6b69a3b5f452"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.061573 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3909-account-create-update-z2hg5"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075205 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-combined-ca-bundle\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-metrics-certs-tls-certs\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075399 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075442 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-config\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075482 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdbserver-sb-tls-certs\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075534 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-scripts\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075578 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdb-rundir\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.075604 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdgh7\" (UniqueName: \"kubernetes.io/projected/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-kube-api-access-fdgh7\") pod \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\" (UID: \"747e7c96-8d95-4c34-9ff3-83dc8c793fc2\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.076861 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-config" (OuterVolumeSpecName: "config") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: 
"747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.077386 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9e84558-a02a-4297-87f2-6b69a3b5f452-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.077417 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.077430 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f0771f7-1250-403c-92b9-72411ed34b2a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.081655 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.081708 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-scripts" (OuterVolumeSpecName: "scripts") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.084689 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5c69-account-create-update-j2dxf"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.094720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-kube-api-access-fdgh7" (OuterVolumeSpecName: "kube-api-access-fdgh7") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "kube-api-access-fdgh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.096736 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.108525 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cq76l"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.119945 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-858jh"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.150241 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-858jh"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.173604 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57f0-account-create-update-4885s" event={"ID":"d5e19d46-3d45-42e4-b59f-d25925415b3c","Type":"ContainerStarted","Data":"3b933bd5ff87f3fce8793baf9ec6cc757a22eb2f6cd1e7bb22e75f0618f9e06b"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.181240 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.181275 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.181284 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.181292 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.181300 4892 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdgh7\" (UniqueName: \"kubernetes.io/projected/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-kube-api-access-fdgh7\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.194887 4892 generic.go:334] "Generic (PLEG): container finished" podID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerID="9bac58ad39e6590d4042399c64a90295868b511562a801762900f675847c27a2" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.194946 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68cd920a-ec23-4053-a5b6-02adbf11eaf0","Type":"ContainerDied","Data":"9bac58ad39e6590d4042399c64a90295868b511562a801762900f675847c27a2"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.195896 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fa7c-account-create-update-q289d"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.199546 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.200773 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_747e7c96-8d95-4c34-9ff3-83dc8c793fc2/ovsdbserver-sb/0.log" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.200946 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.201610 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"747e7c96-8d95-4c34-9ff3-83dc8c793fc2","Type":"ContainerDied","Data":"75439c82587547eef8780e63384b061d636764c68d1a719e24b0c40e7e911241"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.201639 4892 scope.go:117] "RemoveContainer" containerID="c98b346276d9129a2832ecb16437471be06a6bf5533e4bd04d38cba334c672a5" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.213655 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bz7v2"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.220750 4892 generic.go:334] "Generic (PLEG): container finished" podID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerID="094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.220800 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69accab-69f4-4f35-91cc-b9fb1d0fded2","Type":"ContainerDied","Data":"094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.221416 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bz7v2"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.230254 4892 scope.go:117] "RemoveContainer" containerID="1ea89386cc3497040f999fba08153fd875851acce88b76b5cb7c8280e1126f85" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.241254 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-g25cr"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.248682 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" 
event={"ID":"f9e84558-a02a-4297-87f2-6b69a3b5f452","Type":"ContainerDied","Data":"ffdda80b687731c267061231019e007ba261f3e3d162c1927ed439b62939e3ec"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.248793 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-tdc8n" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.261748 4892 generic.go:334] "Generic (PLEG): container finished" podID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerID="70e153e496f3b77144597bb37e7faa188dca474fe0e9e263bb0ae95405212f33" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.261798 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" event={"ID":"bf3f5fcf-07b1-4f42-a5be-5b11052d080a","Type":"ContainerDied","Data":"70e153e496f3b77144597bb37e7faa188dca474fe0e9e263bb0ae95405212f33"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.263267 4892 generic.go:334] "Generic (PLEG): container finished" podID="eb02530d-521f-427b-a570-f35de0665ecc" containerID="f7ec71d9b448df0c427bf1c569f629e2d1b04b498fb150e292de366d671c7e43" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.263596 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb02530d-521f-427b-a570-f35de0665ecc","Type":"ContainerDied","Data":"f7ec71d9b448df0c427bf1c569f629e2d1b04b498fb150e292de366d671c7e43"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.268351 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.268502 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" containerName="nova-scheduler-scheduler" containerID="cri-o://4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc" gracePeriod=30 Feb 17 
18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.275455 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.277426 4892 generic.go:334] "Generic (PLEG): container finished" podID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerID="b921cc6366edc36ee38b6a143bfc88dcd8710d62a31bab58b5973626b24cb7ca" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.277517 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d47b9fd6-4gbj5" event={"ID":"8947a6cb-f018-4042-a2f8-e17591b0394d","Type":"ContainerDied","Data":"b921cc6366edc36ee38b6a143bfc88dcd8710d62a31bab58b5973626b24cb7ca"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.284630 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-4885s"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.286327 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.286426 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.288898 4892 generic.go:334] "Generic (PLEG): container finished" podID="7a97132b-c4ac-4645-8844-1dc5acf466a1" 
containerID="2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.288960 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" event={"ID":"7a97132b-c4ac-4645-8844-1dc5acf466a1","Type":"ContainerDied","Data":"2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.290137 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5c69-account-create-update-j2dxf" event={"ID":"40312e89-cbb3-4c9f-b8bc-33113bd6462f","Type":"ContainerStarted","Data":"12368ccabe5cab0659b9d014f3bbee02577022749b8a0c64b998ebdb847f481b"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.297158 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.306298 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.306576 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" containerName="nova-cell1-conductor-conductor" containerID="cri-o://59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f" gracePeriod=30 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.335666 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "747e7c96-8d95-4c34-9ff3-83dc8c793fc2" (UID: "747e7c96-8d95-4c34-9ff3-83dc8c793fc2"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.338410 4892 scope.go:117] "RemoveContainer" containerID="047fafbe070b5c5f98549aa6ce2f47850884bd46a8bba02c12bc6d546ba53b35" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.338528 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.338523 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qtjqg"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.346931 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qtjqg"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.363450 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lvxxb"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.377249 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.377429 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="03480c10-3249-4caa-b0da-919bbe13c03f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05" gracePeriod=30 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.389150 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.389186 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/747e7c96-8d95-4c34-9ff3-83dc8c793fc2-metrics-certs-tls-certs\") on 
node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.395340 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lvxxb"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400297 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="560fe248893cb8f1a43c62388a6d42d718ed7cc7f26d7e07babf294df3388633" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400327 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="4d5dff4d38a7a79f798428eae45a2013e22a5357dd2fcd764f8587d896667672" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400337 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="58e0df0679c6d02fa3c1eeeb9fd8b0c6a7d302f3247b2ed8e7c839d88c861c37" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400345 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="ecc298da1bf313ad808f705d76569aadd8ccd7b7dec598f4818582d15e76c068" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400353 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="0f12b3eda7e47024c7abf99bbcbb51563466d6864f1fb053fa6d1defadfb7e85" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400361 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="c1d342baf3f40d0f5eafc73a64df7c0cd93b1f05a6ea729e27e0c6aa8625b809" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400357 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"560fe248893cb8f1a43c62388a6d42d718ed7cc7f26d7e07babf294df3388633"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400395 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"4d5dff4d38a7a79f798428eae45a2013e22a5357dd2fcd764f8587d896667672"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400443 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"58e0df0679c6d02fa3c1eeeb9fd8b0c6a7d302f3247b2ed8e7c839d88c861c37"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400457 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"ecc298da1bf313ad808f705d76569aadd8ccd7b7dec598f4818582d15e76c068"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400468 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"0f12b3eda7e47024c7abf99bbcbb51563466d6864f1fb053fa6d1defadfb7e85"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400479 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"c1d342baf3f40d0f5eafc73a64df7c0cd93b1f05a6ea729e27e0c6aa8625b809"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400489 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"6ad15eb01bd19157c80c0bb24ae02bfdeddfc5c770f3fde85bbd8a6f9ea1c582"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 
18:07:32.400369 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="6ad15eb01bd19157c80c0bb24ae02bfdeddfc5c770f3fde85bbd8a6f9ea1c582" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400512 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="5215ee6df896cbac8711678f30bfc8c34ed8386052d37de70cf674a5aa175b70" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400524 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="736363b29d80b41bd0b961a49490e559581f0ae8dd26533460a7abdb41743241" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400574 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"5215ee6df896cbac8711678f30bfc8c34ed8386052d37de70cf674a5aa175b70"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.400590 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"736363b29d80b41bd0b961a49490e559581f0ae8dd26533460a7abdb41743241"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.409032 4892 generic.go:334] "Generic (PLEG): container finished" podID="60523b2e-a498-4bc9-920b-32f117afb898" containerID="c90905bd645724d583f06264c835c33dbac29aab54cd0307b6a53d4573add124" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.409096 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"60523b2e-a498-4bc9-920b-32f117afb898","Type":"ContainerDied","Data":"c90905bd645724d583f06264c835c33dbac29aab54cd0307b6a53d4573add124"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.421427 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_c085ee96-4617-4fa6-b546-a68d29c6238b/ovsdbserver-nb/0.log" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.421503 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c085ee96-4617-4fa6-b546-a68d29c6238b","Type":"ContainerDied","Data":"adf20f4c1fb1e37e6edc36fa477c3ec7dfa0b7c2a94d5c159631dd0053972529"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.421593 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.428607 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-787594b47-2xt6h"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.428907 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-787594b47-2xt6h" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-httpd" containerID="cri-o://7aa2c24e87c1dd7aca3eb443e9d0a1a6e4ac79e766130bd75af04a8a8e5e4d3c" gracePeriod=30 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.429281 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-787594b47-2xt6h" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-server" containerID="cri-o://9fe16bec1150bc58439b8a3146b91ab3797ac2826fcfb2f2ad628f2449331e3c" gracePeriod=30 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.445420 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fa7c-account-create-update-q289d" event={"ID":"e335cae7-2188-47a9-8eef-f53353b22dd4","Type":"ContainerStarted","Data":"8a775eb0f599612e05daeaa486a21cee061ad83015907ea034c8e84cea46e06d"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.452418 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-tdc8n"] Feb 17 18:07:32 crc kubenswrapper[4892]: 
I0217 18:07:32.462473 4892 generic.go:334] "Generic (PLEG): container finished" podID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerID="bbca05ba8000e1544dd1256fe1d48355fe4077385199194480c09fd31d0d03ad" exitCode=143 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.462753 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ca9c4ed-1247-4340-a675-b9d50dcbed99","Type":"ContainerDied","Data":"bbca05ba8000e1544dd1256fe1d48355fe4077385199194480c09fd31d0d03ad"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.464067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5f5a-account-create-update-g25cr" event={"ID":"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9","Type":"ContainerStarted","Data":"99608cc0fafb1868bdc4fca20ea615d044a46d25ecdbc2662f59e2d3214b8585"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.466362 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-tdc8n"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.484620 4892 generic.go:334] "Generic (PLEG): container finished" podID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerID="72c2cbaf2de54480ce7abb484d8c16ea293b67cb726839d9a2f462baee040be3" exitCode=0 Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.484732 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6736a08-c35b-491c-b408-8a3dd641cd51","Type":"ContainerDied","Data":"72c2cbaf2de54480ce7abb484d8c16ea293b67cb726839d9a2f462baee040be3"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.517892 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cq76l" event={"ID":"7a130c05-1db7-4d93-899e-05d086a2bcca","Type":"ContainerStarted","Data":"f75e5afe3c240e3b7b80ea954bd8614914d8c5319be4f7207619dccdca9e42d5"} Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.517910 4892 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.517966 4892 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-cq76l" secret="" err="secret \"galera-openstack-cell1-dockercfg-gnxkd\" not found" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.539769 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.540129 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.543232 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.550676 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.570149 4892 scope.go:117] "RemoveContainer" containerID="f481d1c9a09f24ed0ad0a19d49e204be43df31b957576fba8fc3094295e28527" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.570595 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 18:07:32 crc kubenswrapper[4892]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 17 18:07:32 crc kubenswrapper[4892]: Feb 17 18:07:32 crc kubenswrapper[4892]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 17 18:07:32 crc kubenswrapper[4892]: Feb 17 18:07:32 crc kubenswrapper[4892]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 17 18:07:32 crc kubenswrapper[4892]: Feb 17 18:07:32 crc kubenswrapper[4892]: MYSQL_CMD="mysql 
-h -u root -P 3306" Feb 17 18:07:32 crc kubenswrapper[4892]: Feb 17 18:07:32 crc kubenswrapper[4892]: if [ -n "" ]; then Feb 17 18:07:32 crc kubenswrapper[4892]: GRANT_DATABASE="" Feb 17 18:07:32 crc kubenswrapper[4892]: else Feb 17 18:07:32 crc kubenswrapper[4892]: GRANT_DATABASE="*" Feb 17 18:07:32 crc kubenswrapper[4892]: fi Feb 17 18:07:32 crc kubenswrapper[4892]: Feb 17 18:07:32 crc kubenswrapper[4892]: # going for maximum compatibility here: Feb 17 18:07:32 crc kubenswrapper[4892]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 17 18:07:32 crc kubenswrapper[4892]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 17 18:07:32 crc kubenswrapper[4892]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 17 18:07:32 crc kubenswrapper[4892]: # support updates Feb 17 18:07:32 crc kubenswrapper[4892]: Feb 17 18:07:32 crc kubenswrapper[4892]: $MYSQL_CMD < logger="UnhandledError" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.578257 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-cq76l" podUID="7a130c05-1db7-4d93-899e-05d086a2bcca" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593040 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-tls\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593210 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4skhm\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-kube-api-access-4skhm\") pod 
\"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593487 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-server-conf\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593679 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/60523b2e-a498-4bc9-920b-32f117afb898-pod-info\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593805 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-erlang-cookie\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593852 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-confd\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.593870 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/60523b2e-a498-4bc9-920b-32f117afb898-erlang-cookie-secret\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.594001 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.594124 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-plugins-conf\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.594157 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-config-data\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.594176 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-plugins\") pod \"60523b2e-a498-4bc9-920b-32f117afb898\" (UID: \"60523b2e-a498-4bc9-920b-32f117afb898\") " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.618499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.628520 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.628612 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.629626 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.629724 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/60523b2e-a498-4bc9-920b-32f117afb898-pod-info" (OuterVolumeSpecName: "pod-info") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.630994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-kube-api-access-4skhm" (OuterVolumeSpecName: "kube-api-access-4skhm") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "kube-api-access-4skhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.634311 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60523b2e-a498-4bc9-920b-32f117afb898-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.639579 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.653900 4892 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/60523b2e-a498-4bc9-920b-32f117afb898-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.653951 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.653961 4892 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.653975 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.653988 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.654000 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4skhm\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-kube-api-access-4skhm\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.654013 4892 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/60523b2e-a498-4bc9-920b-32f117afb898-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.654023 4892 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.654895 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.684839 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts podName:7a130c05-1db7-4d93-899e-05d086a2bcca nodeName:}" failed. No retries permitted until 2026-02-17 18:07:33.154949749 +0000 UTC m=+1424.530353014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts") pod "root-account-create-update-cq76l" (UID: "7a130c05-1db7-4d93-899e-05d086a2bcca") : configmap "openstack-cell1-scripts" not found Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.697154 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-config-data" (OuterVolumeSpecName: "config-data") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.743212 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.743865 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-server-conf" (OuterVolumeSpecName: "server-conf") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: "60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.756763 4892 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.756793 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.756820 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60523b2e-a498-4bc9-920b-32f117afb898-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.758930 4892 scope.go:117] "RemoveContainer" containerID="9e7e188405b6a8e4ef6bb57b42cbb6f284b2256f64aba5c60b2ec472f06d945a" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.794353 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "60523b2e-a498-4bc9-920b-32f117afb898" (UID: 
"60523b2e-a498-4bc9-920b-32f117afb898"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.799043 4892 scope.go:117] "RemoveContainer" containerID="091908dabe7cfe9736d9853d937fdfd18cc05ba8b8dcdcc384696d866e69fa27" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.803175 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.812544 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.837556 4892 scope.go:117] "RemoveContainer" containerID="82eb422a29271ebf3f8cf145783a1075bbaffa1ac3279210e6826fd4595ca345" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.861084 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/60523b2e-a498-4bc9-920b-32f117afb898-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.868608 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.224:6080/vnc_lite.html\": dial tcp 10.217.0.224:6080: connect: connection refused" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.935551 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-4885s" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.987460 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6vq9m"] Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988050 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="ovsdbserver-sb" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988069 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="ovsdbserver-sb" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988085 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="setup-container" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988093 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="setup-container" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988110 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988120 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988133 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988141 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988156 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988164 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988203 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="ovsdbserver-nb" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988212 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="ovsdbserver-nb" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988222 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerName="dnsmasq-dns" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988229 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerName="dnsmasq-dns" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988256 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerName="init" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988264 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerName="init" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988281 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988289 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: E0217 18:07:32.988305 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="rabbitmq" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988313 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="rabbitmq" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988560 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="ovsdbserver-nb" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988588 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988606 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988620 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" containerName="openstack-network-exporter" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988632 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="60523b2e-a498-4bc9-920b-32f117afb898" containerName="rabbitmq" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988641 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" containerName="ovn-controller" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988656 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" containerName="dnsmasq-dns" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.988670 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" containerName="ovsdbserver-sb" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.989472 4892 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:32 crc kubenswrapper[4892]: I0217 18:07:32.991989 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.006568 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6vq9m"] Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.067138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e19d46-3d45-42e4-b59f-d25925415b3c-operator-scripts\") pod \"d5e19d46-3d45-42e4-b59f-d25925415b3c\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.067196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sn52\" (UniqueName: \"kubernetes.io/projected/d5e19d46-3d45-42e4-b59f-d25925415b3c-kube-api-access-6sn52\") pod \"d5e19d46-3d45-42e4-b59f-d25925415b3c\" (UID: \"d5e19d46-3d45-42e4-b59f-d25925415b3c\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.067506 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qwd4\" (UniqueName: \"kubernetes.io/projected/49339b73-3846-4ce0-aa1d-d285b333c807-kube-api-access-2qwd4\") pod \"root-account-create-update-6vq9m\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.067652 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49339b73-3846-4ce0-aa1d-d285b333c807-operator-scripts\") pod \"root-account-create-update-6vq9m\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " 
pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.069212 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e19d46-3d45-42e4-b59f-d25925415b3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5e19d46-3d45-42e4-b59f-d25925415b3c" (UID: "d5e19d46-3d45-42e4-b59f-d25925415b3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.076994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e19d46-3d45-42e4-b59f-d25925415b3c-kube-api-access-6sn52" (OuterVolumeSpecName: "kube-api-access-6sn52") pod "d5e19d46-3d45-42e4-b59f-d25925415b3c" (UID: "d5e19d46-3d45-42e4-b59f-d25925415b3c"). InnerVolumeSpecName "kube-api-access-6sn52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.143740 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.174164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49339b73-3846-4ce0-aa1d-d285b333c807-operator-scripts\") pod \"root-account-create-update-6vq9m\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.174252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qwd4\" (UniqueName: \"kubernetes.io/projected/49339b73-3846-4ce0-aa1d-d285b333c807-kube-api-access-2qwd4\") pod \"root-account-create-update-6vq9m\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.174354 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e19d46-3d45-42e4-b59f-d25925415b3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.174368 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sn52\" (UniqueName: \"kubernetes.io/projected/d5e19d46-3d45-42e4-b59f-d25925415b3c-kube-api-access-6sn52\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.174441 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.174490 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts podName:7a130c05-1db7-4d93-899e-05d086a2bcca nodeName:}" failed. 
No retries permitted until 2026-02-17 18:07:34.174475532 +0000 UTC m=+1425.549878797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts") pod "root-account-create-update-cq76l" (UID: "7a130c05-1db7-4d93-899e-05d086a2bcca") : configmap "openstack-cell1-scripts" not found Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.175331 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49339b73-3846-4ce0-aa1d-d285b333c807-operator-scripts\") pod \"root-account-create-update-6vq9m\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.182558 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.193637 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qwd4\" (UniqueName: \"kubernetes.io/projected/49339b73-3846-4ce0-aa1d-d285b333c807-kube-api-access-2qwd4\") pod \"root-account-create-update-6vq9m\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.223876 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.280386 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e335cae7-2188-47a9-8eef-f53353b22dd4-operator-scripts\") pod \"e335cae7-2188-47a9-8eef-f53353b22dd4\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.280562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/588baeb0-b9ae-4dcf-83c6-36b641359bfd-operator-scripts\") pod \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.280601 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmlfq\" (UniqueName: \"kubernetes.io/projected/588baeb0-b9ae-4dcf-83c6-36b641359bfd-kube-api-access-gmlfq\") pod \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\" (UID: \"588baeb0-b9ae-4dcf-83c6-36b641359bfd\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.280735 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-226mw\" (UniqueName: \"kubernetes.io/projected/e335cae7-2188-47a9-8eef-f53353b22dd4-kube-api-access-226mw\") pod \"e335cae7-2188-47a9-8eef-f53353b22dd4\" (UID: \"e335cae7-2188-47a9-8eef-f53353b22dd4\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.281802 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e335cae7-2188-47a9-8eef-f53353b22dd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e335cae7-2188-47a9-8eef-f53353b22dd4" (UID: "e335cae7-2188-47a9-8eef-f53353b22dd4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.282185 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588baeb0-b9ae-4dcf-83c6-36b641359bfd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "588baeb0-b9ae-4dcf-83c6-36b641359bfd" (UID: "588baeb0-b9ae-4dcf-83c6-36b641359bfd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.286218 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588baeb0-b9ae-4dcf-83c6-36b641359bfd-kube-api-access-gmlfq" (OuterVolumeSpecName: "kube-api-access-gmlfq") pod "588baeb0-b9ae-4dcf-83c6-36b641359bfd" (UID: "588baeb0-b9ae-4dcf-83c6-36b641359bfd"). InnerVolumeSpecName "kube-api-access-gmlfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.290896 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e335cae7-2188-47a9-8eef-f53353b22dd4-kube-api-access-226mw" (OuterVolumeSpecName: "kube-api-access-226mw") pod "e335cae7-2188-47a9-8eef-f53353b22dd4" (UID: "e335cae7-2188-47a9-8eef-f53353b22dd4"). InnerVolumeSpecName "kube-api-access-226mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.302924 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.311794 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.335872 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-g25cr" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.380785 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7546b1-a71d-4a95-afd3-adf70b749d04" path="/var/lib/kubelet/pods/0d7546b1-a71d-4a95-afd3-adf70b749d04/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.381487 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0771f7-1250-403c-92b9-72411ed34b2a" path="/var/lib/kubelet/pods/0f0771f7-1250-403c-92b9-72411ed34b2a/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.382005 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffcecae-c54a-4d76-9bbd-e7c406582928" path="/var/lib/kubelet/pods/0ffcecae-c54a-4d76-9bbd-e7c406582928/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.383210 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150beb6f-d09f-42cd-8294-acfe9bf7bcee" path="/var/lib/kubelet/pods/150beb6f-d09f-42cd-8294-acfe9bf7bcee/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.383719 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55036453-856d-458e-a6c9-30809c87ccaf" path="/var/lib/kubelet/pods/55036453-856d-458e-a6c9-30809c87ccaf/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.384207 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293" path="/var/lib/kubelet/pods/5c01d520-e8b4-4e4d-9aaf-ffe0a5bb4293/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.384843 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747e7c96-8d95-4c34-9ff3-83dc8c793fc2" path="/var/lib/kubelet/pods/747e7c96-8d95-4c34-9ff3-83dc8c793fc2/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.385843 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aa192ddc-2dc5-4684-afc7-29c9e9db5f6b" path="/var/lib/kubelet/pods/aa192ddc-2dc5-4684-afc7-29c9e9db5f6b/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.386570 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c085ee96-4617-4fa6-b546-a68d29c6238b" path="/var/lib/kubelet/pods/c085ee96-4617-4fa6-b546-a68d29c6238b/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.387517 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e84558-a02a-4297-87f2-6b69a3b5f452" path="/var/lib/kubelet/pods/f9e84558-a02a-4297-87f2-6b69a3b5f452/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388137 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff799349-84ed-44f7-8f46-d11d5637abf1" path="/var/lib/kubelet/pods/ff799349-84ed-44f7-8f46-d11d5637abf1/volumes" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388509 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-vencrypt-tls-certs\") pod \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388707 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-548mr\" (UniqueName: \"kubernetes.io/projected/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-kube-api-access-548mr\") pod \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388732 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-combined-ca-bundle\") pod \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: 
I0217 18:07:33.388758 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40312e89-cbb3-4c9f-b8bc-33113bd6462f-operator-scripts\") pod \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388799 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-nova-novncproxy-tls-certs\") pod \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388909 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5qjz\" (UniqueName: \"kubernetes.io/projected/40312e89-cbb3-4c9f-b8bc-33113bd6462f-kube-api-access-b5qjz\") pod \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\" (UID: \"40312e89-cbb3-4c9f-b8bc-33113bd6462f\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.388948 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-config-data\") pod \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\" (UID: \"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.389404 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/588baeb0-b9ae-4dcf-83c6-36b641359bfd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.389416 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmlfq\" (UniqueName: \"kubernetes.io/projected/588baeb0-b9ae-4dcf-83c6-36b641359bfd-kube-api-access-gmlfq\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: 
I0217 18:07:33.389426 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-226mw\" (UniqueName: \"kubernetes.io/projected/e335cae7-2188-47a9-8eef-f53353b22dd4-kube-api-access-226mw\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.389436 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e335cae7-2188-47a9-8eef-f53353b22dd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.396092 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40312e89-cbb3-4c9f-b8bc-33113bd6462f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40312e89-cbb3-4c9f-b8bc-33113bd6462f" (UID: "40312e89-cbb3-4c9f-b8bc-33113bd6462f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.399006 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-kube-api-access-548mr" (OuterVolumeSpecName: "kube-api-access-548mr") pod "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" (UID: "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9"). InnerVolumeSpecName "kube-api-access-548mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.405554 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40312e89-cbb3-4c9f-b8bc-33113bd6462f-kube-api-access-b5qjz" (OuterVolumeSpecName: "kube-api-access-b5qjz") pod "40312e89-cbb3-4c9f-b8bc-33113bd6462f" (UID: "40312e89-cbb3-4c9f-b8bc-33113bd6462f"). InnerVolumeSpecName "kube-api-access-b5qjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.460074 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-config-data" (OuterVolumeSpecName: "config-data") pod "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" (UID: "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.466497 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" (UID: "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.483982 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" (UID: "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493136 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fjjx\" (UniqueName: \"kubernetes.io/projected/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-kube-api-access-7fjjx\") pod \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493200 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-operator-scripts\") pod \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\" (UID: \"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9\") " Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493934 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5qjz\" (UniqueName: \"kubernetes.io/projected/40312e89-cbb3-4c9f-b8bc-33113bd6462f-kube-api-access-b5qjz\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493947 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493957 4892 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493967 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-548mr\" (UniqueName: \"kubernetes.io/projected/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-kube-api-access-548mr\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493976 4892 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.493985 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40312e89-cbb3-4c9f-b8bc-33113bd6462f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.495859 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9" (UID: "c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.505598 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-kube-api-access-7fjjx" (OuterVolumeSpecName: "kube-api-access-7fjjx") pod "c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9" (UID: "c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9"). InnerVolumeSpecName "kube-api-access-7fjjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.534547 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" (UID: "bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.589407 4892 generic.go:334] "Generic (PLEG): container finished" podID="38862686-bfab-4f7d-8367-ec59a68b0299" containerID="30d52ee98141b0a4b18a9c3892ae7d9fadf3cc7b619452277145c17c67f590bb" exitCode=0 Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.598902 4892 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.598934 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fjjx\" (UniqueName: \"kubernetes.io/projected/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-kube-api-access-7fjjx\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.598945 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.609452 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5f5a-account-create-update-g25cr" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.633421 4892 generic.go:334] "Generic (PLEG): container finished" podID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerID="9fe16bec1150bc58439b8a3146b91ab3797ac2826fcfb2f2ad628f2449331e3c" exitCode=0 Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.633449 4892 generic.go:334] "Generic (PLEG): container finished" podID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerID="7aa2c24e87c1dd7aca3eb443e9d0a1a6e4ac79e766130bd75af04a8a8e5e4d3c" exitCode=0 Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.639450 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fa7c-account-create-update-q289d" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.675673 4892 generic.go:334] "Generic (PLEG): container finished" podID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerID="c7cb681f90d6976e6b5cbf00e7ea59a76be308b0fe3edc7745833b928200cf4d" exitCode=0 Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.685282 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57f0-account-create-update-4885s" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.689893 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5c69-account-create-update-j2dxf" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.700268 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.700314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljdlr\" (UniqueName: \"kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr\") pod \"nova-cell1-7299-account-create-update-frm9q\" (UID: \"a6a1925e-e87d-4d6e-b9a3-35e46d58fc54\") " pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.700657 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.700693 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts 
podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:37.700681014 +0000 UTC m=+1429.076084279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : configmap "openstack-cell1-scripts" not found Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.715162 4892 projected.go:194] Error preparing data for projected volume kube-api-access-ljdlr for pod openstack/nova-cell1-7299-account-create-update-frm9q: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.715217 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr podName:a6a1925e-e87d-4d6e-b9a3-35e46d58fc54 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:37.715202747 +0000 UTC m=+1429.090606012 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljdlr" (UniqueName: "kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr") pod "nova-cell1-7299-account-create-update-frm9q" (UID: "a6a1925e-e87d-4d6e-b9a3-35e46d58fc54") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.747524 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.768041 4892 generic.go:334] "Generic (PLEG): container finished" podID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" containerID="810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e" exitCode=0 Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.768138 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.809099 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3909-account-create-update-z2hg5" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.894048 4892 generic.go:334] "Generic (PLEG): container finished" podID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerID="890f0489886784d4621fef4344f5982943816d37e5cb528aa0fe4904d5ef40c8" exitCode=0 Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905375 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7299-account-create-update-frm9q" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38862686-bfab-4f7d-8367-ec59a68b0299","Type":"ContainerDied","Data":"30d52ee98141b0a4b18a9c3892ae7d9fadf3cc7b619452277145c17c67f590bb"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905528 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5f5a-account-create-update-g25cr" event={"ID":"c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9","Type":"ContainerDied","Data":"99608cc0fafb1868bdc4fca20ea615d044a46d25ecdbc2662f59e2d3214b8585"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905544 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-787594b47-2xt6h" event={"ID":"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1","Type":"ContainerDied","Data":"9fe16bec1150bc58439b8a3146b91ab3797ac2826fcfb2f2ad628f2449331e3c"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905556 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-787594b47-2xt6h" event={"ID":"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1","Type":"ContainerDied","Data":"7aa2c24e87c1dd7aca3eb443e9d0a1a6e4ac79e766130bd75af04a8a8e5e4d3c"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905565 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fa7c-account-create-update-q289d" event={"ID":"e335cae7-2188-47a9-8eef-f53353b22dd4","Type":"ContainerDied","Data":"8a775eb0f599612e05daeaa486a21cee061ad83015907ea034c8e84cea46e06d"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905576 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" event={"ID":"bf3f5fcf-07b1-4f42-a5be-5b11052d080a","Type":"ContainerDied","Data":"c7cb681f90d6976e6b5cbf00e7ea59a76be308b0fe3edc7745833b928200cf4d"} Feb 17 
18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905598 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57f0-account-create-update-4885s" event={"ID":"d5e19d46-3d45-42e4-b59f-d25925415b3c","Type":"ContainerDied","Data":"3b933bd5ff87f3fce8793baf9ec6cc757a22eb2f6cd1e7bb22e75f0618f9e06b"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905609 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5c69-account-create-update-j2dxf" event={"ID":"40312e89-cbb3-4c9f-b8bc-33113bd6462f","Type":"ContainerDied","Data":"12368ccabe5cab0659b9d014f3bbee02577022749b8a0c64b998ebdb847f481b"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"60523b2e-a498-4bc9-920b-32f117afb898","Type":"ContainerDied","Data":"33fb36cb1c79714be309b862878f90ca69249e16aad9f6cc0881a87adaf05284"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905633 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9","Type":"ContainerDied","Data":"810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905644 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9","Type":"ContainerDied","Data":"bc08a0edfe171310cda0903456032b06e034d627c97015128f7fd04f37fd7bd4"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905654 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3909-account-create-update-z2hg5" event={"ID":"588baeb0-b9ae-4dcf-83c6-36b641359bfd","Type":"ContainerDied","Data":"c390e6b748815be52135ceb917b45b1047d3687f25d4e14c38231809c529dba4"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905667 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b","Type":"ContainerDied","Data":"890f0489886784d4621fef4344f5982943816d37e5cb528aa0fe4904d5ef40c8"} Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.905689 4892 scope.go:117] "RemoveContainer" containerID="c90905bd645724d583f06264c835c33dbac29aab54cd0307b6a53d4573add124" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.941648 4892 scope.go:117] "RemoveContainer" containerID="e99502375ab69e8b92fee8d394383cb65bc6d54cb14279a57cb6a3666079c978" Feb 17 18:07:33 crc kubenswrapper[4892]: I0217 18:07:33.967744 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6vq9m"] Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.983451 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.984510 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.997040 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 18:07:33 crc kubenswrapper[4892]: E0217 18:07:33.997097 4892 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="03480c10-3249-4caa-b0da-919bbe13c03f" containerName="nova-cell0-conductor-conductor" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.083633 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.117742 4892 scope.go:117] "RemoveContainer" containerID="810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.149014 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-g25cr"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.170947 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5f5a-account-create-update-g25cr"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.193061 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fa7c-account-create-update-q289d"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.224440 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpwrf\" (UniqueName: \"kubernetes.io/projected/38862686-bfab-4f7d-8367-ec59a68b0299-kube-api-access-fpwrf\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.224489 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-generated\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.224531 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-default\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.227094 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.228340 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-kolla-config\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.228397 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-galera-tls-certs\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.228445 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-combined-ca-bundle\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.228536 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.228650 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-operator-scripts\") pod \"38862686-bfab-4f7d-8367-ec59a68b0299\" (UID: \"38862686-bfab-4f7d-8367-ec59a68b0299\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.230133 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.233483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.235222 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.235258 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.235271 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.235354 4892 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.235432 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts podName:7a130c05-1db7-4d93-899e-05d086a2bcca nodeName:}" failed. No retries permitted until 2026-02-17 18:07:36.235389597 +0000 UTC m=+1427.610792862 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts") pod "root-account-create-update-cq76l" (UID: "7a130c05-1db7-4d93-899e-05d086a2bcca") : configmap "openstack-cell1-scripts" not found Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.235832 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-fa7c-account-create-update-q289d"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.239867 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.248895 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38862686-bfab-4f7d-8367-ec59a68b0299-kube-api-access-fpwrf" (OuterVolumeSpecName: "kube-api-access-fpwrf") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "kube-api-access-fpwrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.259581 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.265278 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.288907 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.292603 4892 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.339083 4892 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38862686-bfab-4f7d-8367-ec59a68b0299-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.339133 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.339143 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpwrf\" (UniqueName: \"kubernetes.io/projected/38862686-bfab-4f7d-8367-ec59a68b0299-kube-api-access-fpwrf\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.361656 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.385021 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.423343 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "38862686-bfab-4f7d-8367-ec59a68b0299" (UID: "38862686-bfab-4f7d-8367-ec59a68b0299"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.457801 4892 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.457841 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38862686-bfab-4f7d-8367-ec59a68b0299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.457850 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.612221 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.630897 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.631164 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-central-agent" containerID="cri-o://366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4" gracePeriod=30 Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.631292 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="proxy-httpd" containerID="cri-o://7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66" gracePeriod=30 Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.631335 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="sg-core" containerID="cri-o://a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e" gracePeriod=30 Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.631364 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-notification-agent" containerID="cri-o://a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62" gracePeriod=30 Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.640670 4892 scope.go:117] "RemoveContainer" containerID="810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.658463 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e\": container with ID starting with 810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e not found: ID does not exist" containerID="810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.658508 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e"} err="failed to get container status \"810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e\": rpc error: code = NotFound desc = could not find container \"810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e\": container with ID starting with 810f1314fdb8ea5a53e297e37a30cff80f92654d313f4d524f470fd44d72197e not found: ID does not exist" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.689200 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.689395 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8affd9cf-1116-4643-8045-d445edeaa995" containerName="kube-state-metrics" containerID="cri-o://0f890fee5c00a218e42e924b96222201e9ce426ef9dad178a8610ce733d3612e" gracePeriod=30 Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.698044 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5c69-account-create-update-j2dxf"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.705930 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.708523 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5c69-account-create-update-j2dxf"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.715159 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.737945 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.739423 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.763119 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-combined-ca-bundle\") pod \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.763169 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data-custom\") pod \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.763197 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-logs\") pod \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.763235 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gjtq5\" (UniqueName: \"kubernetes.io/projected/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-kube-api-access-gjtq5\") pod \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.763281 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data\") pod \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\" (UID: \"bf3f5fcf-07b1-4f42-a5be-5b11052d080a\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.768280 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-logs" (OuterVolumeSpecName: "logs") pod "bf3f5fcf-07b1-4f42-a5be-5b11052d080a" (UID: "bf3f5fcf-07b1-4f42-a5be-5b11052d080a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.797594 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bf3f5fcf-07b1-4f42-a5be-5b11052d080a" (UID: "bf3f5fcf-07b1-4f42-a5be-5b11052d080a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.804156 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-kube-api-access-gjtq5" (OuterVolumeSpecName: "kube-api-access-gjtq5") pod "bf3f5fcf-07b1-4f42-a5be-5b11052d080a" (UID: "bf3f5fcf-07b1-4f42-a5be-5b11052d080a"). InnerVolumeSpecName "kube-api-access-gjtq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.814679 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.820496 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="dadb10bf-ed88-454e-8873-9c49f762ef6e" containerName="memcached" containerID="cri-o://264f797728e12c3996d15cc2b9cd8446cea32fc84d31eeb1fec2bcc2395f7027" gracePeriod=30 Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.831465 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-36cd-account-create-update-9tkph"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865179 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tctl5\" (UniqueName: \"kubernetes.io/projected/7a130c05-1db7-4d93-899e-05d086a2bcca-kube-api-access-tctl5\") pod \"7a130c05-1db7-4d93-899e-05d086a2bcca\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865234 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-run-httpd\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865259 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-scripts\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865292 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-config-data\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865319 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-internal-tls-certs\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865369 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-combined-ca-bundle\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865396 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-internal-tls-certs\") pod \"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865418 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjbvq\" (UniqueName: \"kubernetes.io/projected/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-kube-api-access-qjbvq\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865453 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts\") pod \"7a130c05-1db7-4d93-899e-05d086a2bcca\" (UID: \"7a130c05-1db7-4d93-899e-05d086a2bcca\") " Feb 17 18:07:34 crc 
kubenswrapper[4892]: I0217 18:07:34.865470 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8jxj\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-kube-api-access-r8jxj\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865485 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-public-tls-certs\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865503 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-log-httpd\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865554 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-config-data\") pod \"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865572 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-logs\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865593 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c6047a2-148b-46cf-a50b-b7147c7c9902-logs\") pod 
\"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865610 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data-custom\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865629 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-etc-machine-id\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-public-tls-certs\") pod \"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865694 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-public-tls-certs\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865716 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-etc-swift\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865735 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-scripts\") pod \"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865761 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-internal-tls-certs\") pod \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\" (UID: \"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865785 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-combined-ca-bundle\") pod \"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865846 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggf79\" (UniqueName: \"kubernetes.io/projected/5c6047a2-148b-46cf-a50b-b7147c7c9902-kube-api-access-ggf79\") pod \"5c6047a2-148b-46cf-a50b-b7147c7c9902\" (UID: \"5c6047a2-148b-46cf-a50b-b7147c7c9902\") " Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.865924 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-combined-ca-bundle\") pod \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\" (UID: \"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b\") " Feb 17 
18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.866292 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.866308 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.866976 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3909-account-create-update-z2hg5"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.881032 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtq5\" (UniqueName: \"kubernetes.io/projected/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-kube-api-access-gjtq5\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.882045 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a130c05-1db7-4d93-899e-05d086a2bcca" (UID: "7a130c05-1db7-4d93-899e-05d086a2bcca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.888327 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf3f5fcf-07b1-4f42-a5be-5b11052d080a" (UID: "bf3f5fcf-07b1-4f42-a5be-5b11052d080a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.889105 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-36cd-account-create-update-9tkph"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.889171 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.890895 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.893804 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6047a2-148b-46cf-a50b-b7147c7c9902-logs" (OuterVolumeSpecName: "logs") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.894757 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-logs" (OuterVolumeSpecName: "logs") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.921616 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-kube-api-access-qjbvq" (OuterVolumeSpecName: "kube-api-access-qjbvq") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "kube-api-access-qjbvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.921935 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.923018 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.930699 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-scripts" (OuterVolumeSpecName: "scripts") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.936405 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.951128 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-kube-api-access-r8jxj" (OuterVolumeSpecName: "kube-api-access-r8jxj") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "kube-api-access-r8jxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.952107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a130c05-1db7-4d93-899e-05d086a2bcca-kube-api-access-tctl5" (OuterVolumeSpecName: "kube-api-access-tctl5") pod "7a130c05-1db7-4d93-899e-05d086a2bcca" (UID: "7a130c05-1db7-4d93-899e-05d086a2bcca"). InnerVolumeSpecName "kube-api-access-tctl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.952207 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-scripts" (OuterVolumeSpecName: "scripts") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.954475 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cq76l" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.956396 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cq76l" event={"ID":"7a130c05-1db7-4d93-899e-05d086a2bcca","Type":"ContainerDied","Data":"f75e5afe3c240e3b7b80ea954bd8614914d8c5319be4f7207619dccdca9e42d5"} Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.973360 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3909-account-create-update-z2hg5"] Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.976591 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6047a2-148b-46cf-a50b-b7147c7c9902-kube-api-access-ggf79" (OuterVolumeSpecName: "kube-api-access-ggf79") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "kube-api-access-ggf79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984565 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tctl5\" (UniqueName: \"kubernetes.io/projected/7a130c05-1db7-4d93-899e-05d086a2bcca-kube-api-access-tctl5\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984598 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984611 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984624 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984636 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjbvq\" (UniqueName: \"kubernetes.io/projected/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-kube-api-access-qjbvq\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984648 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8jxj\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-kube-api-access-r8jxj\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984658 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a130c05-1db7-4d93-899e-05d086a2bcca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984668 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984678 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984689 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c6047a2-148b-46cf-a50b-b7147c7c9902-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984700 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc 
kubenswrapper[4892]: I0217 18:07:34.984713 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984724 4892 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984734 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984745 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggf79\" (UniqueName: \"kubernetes.io/projected/5c6047a2-148b-46cf-a50b-b7147c7c9902-kube-api-access-ggf79\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.984766 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-36cd-account-create-update-5bllx"] Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985208 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38862686-bfab-4f7d-8367-ec59a68b0299" containerName="galera" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985226 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38862686-bfab-4f7d-8367-ec59a68b0299" containerName="galera" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985251 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-api" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985258 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-api" Feb 17 
18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985275 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985281 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985290 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-server" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985295 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-server" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985310 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985316 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985331 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener-log" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985338 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener-log" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985345 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985351 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985362 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-httpd" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985368 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-httpd" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985384 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-log" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985390 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-log" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985401 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api-log" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985408 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api-log" Feb 17 18:07:34 crc kubenswrapper[4892]: E0217 18:07:34.985424 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38862686-bfab-4f7d-8367-ec59a68b0299" containerName="mysql-bootstrap" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985431 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38862686-bfab-4f7d-8367-ec59a68b0299" containerName="mysql-bootstrap" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985643 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api-log" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985659 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38862686-bfab-4f7d-8367-ec59a68b0299" containerName="galera" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985670 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-api" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985680 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-server" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985697 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985708 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985718 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerName="placement-log" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985733 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" containerName="proxy-httpd" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985744 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" containerName="barbican-keystone-listener-log" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.985753 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.986359 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.988310 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.996440 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcbe58d5-c580-4a8c-8476-dd29bf7ca91b","Type":"ContainerDied","Data":"22fa72325d60d7548606add73f4ef937e113f09c885e10d204399bd2f2063809"} Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.996622 4892 scope.go:117] "RemoveContainer" containerID="890f0489886784d4621fef4344f5982943816d37e5cb528aa0fe4904d5ef40c8" Feb 17 18:07:34 crc kubenswrapper[4892]: I0217 18:07:34.996782 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.003564 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.004580 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38862686-bfab-4f7d-8367-ec59a68b0299","Type":"ContainerDied","Data":"5135b40394a291608961e1f25d10b174759bc6fbb292af7050e6cb4fcb627e0f"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.004663 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.006528 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lvsr8"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.014360 4892 generic.go:334] "Generic (PLEG): container finished" podID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerID="c99d3760836f82bf94729ec9f9c217c2b3cda31831ae4231d401c839708c201c" exitCode=0 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.014427 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ca9c4ed-1247-4340-a675-b9d50dcbed99","Type":"ContainerDied","Data":"c99d3760836f82bf94729ec9f9c217c2b3cda31831ae4231d401c839708c201c"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.023041 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-36cd-account-create-update-5bllx"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.027557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-787594b47-2xt6h" event={"ID":"bb2d2d5a-6727-4c83-800b-03a6cf43b9c1","Type":"ContainerDied","Data":"fc35d0e1deb77dd99fb76aef6d27085992a2baae171876bf5b5fa1f532c2af3a"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.027649 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-787594b47-2xt6h" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.033064 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lvsr8"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.039323 4892 scope.go:117] "RemoveContainer" containerID="0b8f5fa1d04ca87abf7de487fc63b1102d6d2d69e2adcaa02f32fd33d6c4c382" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.053686 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8vb5k"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.062620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" event={"ID":"bf3f5fcf-07b1-4f42-a5be-5b11052d080a","Type":"ContainerDied","Data":"fa7b95a6c3e5e18dd00f12d6037cc4d743852c66ed7710d593c45a39ffa5f904"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.062658 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667fbbdb6d-fdpm7" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.089629 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c6047a2-148b-46cf-a50b-b7147c7c9902" containerID="f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2" exitCode=0 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.089764 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db8547d8d-ftgvm" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.089971 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db8547d8d-ftgvm" event={"ID":"5c6047a2-148b-46cf-a50b-b7147c7c9902","Type":"ContainerDied","Data":"f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090010 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db8547d8d-ftgvm" event={"ID":"5c6047a2-148b-46cf-a50b-b7147c7c9902","Type":"ContainerDied","Data":"4463f5c342aa43de4c462a518f9c057c9e62ab708a85e18143550201a1e1daff"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090533 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-config-data\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090704 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-internal-tls-certs\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090792 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-logs\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090840 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-httpd-run\") pod 
\"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090877 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-scripts\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090934 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-combined-ca-bundle\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090963 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkllr\" (UniqueName: \"kubernetes.io/projected/e69accab-69f4-4f35-91cc-b9fb1d0fded2-kube-api-access-wkllr\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.090981 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\" (UID: \"e69accab-69f4-4f35-91cc-b9fb1d0fded2\") " Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.091209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpd2l\" (UniqueName: \"kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.091268 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.118727 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.122260 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-4885s"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.123365 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-logs" (OuterVolumeSpecName: "logs") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.124746 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-scripts" (OuterVolumeSpecName: "scripts") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.124917 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8vb5k"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.131331 4892 generic.go:334] "Generic (PLEG): container finished" podID="8affd9cf-1116-4643-8045-d445edeaa995" containerID="0f890fee5c00a218e42e924b96222201e9ce426ef9dad178a8610ce733d3612e" exitCode=2 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.131398 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8affd9cf-1116-4643-8045-d445edeaa995","Type":"ContainerDied","Data":"0f890fee5c00a218e42e924b96222201e9ce426ef9dad178a8610ce733d3612e"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.138578 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-57f0-account-create-update-4885s"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.142270 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77d47b9fd6-4gbj5" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.189:9311/healthcheck\": read tcp 10.217.0.2:54948->10.217.0.189:9311: read: connection reset by peer" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.142286 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77d47b9fd6-4gbj5" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.189:9311/healthcheck\": read tcp 10.217.0.2:54962->10.217.0.189:9311: read: connection reset by peer" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.142883 4892 generic.go:334] "Generic (PLEG): container finished" podID="2e7cdd99-a572-4a20-834b-c1373e080496" containerID="a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e" 
exitCode=2 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.142904 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerDied","Data":"a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.145129 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69accab-69f4-4f35-91cc-b9fb1d0fded2-kube-api-access-wkllr" (OuterVolumeSpecName: "kube-api-access-wkllr") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "kube-api-access-wkllr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.145147 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.151338 4892 generic.go:334] "Generic (PLEG): container finished" podID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerID="a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e" exitCode=0 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.151452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69accab-69f4-4f35-91cc-b9fb1d0fded2","Type":"ContainerDied","Data":"a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.151495 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69accab-69f4-4f35-91cc-b9fb1d0fded2","Type":"ContainerDied","Data":"0db668624ce7395d4fcbd56b8ca1553cb4fc0314f6cf7995da270ed58dc804ac"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.151555 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.160872 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dfdc985-pc7r9"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.161071 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-dfdc985-pc7r9" podUID="2fd50e1e-cc22-430b-ab38-88217aeafc59" containerName="keystone-api" containerID="cri-o://d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f" gracePeriod=30 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.169185 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6vq9m" event={"ID":"49339b73-3846-4ce0-aa1d-d285b333c807","Type":"ContainerStarted","Data":"fc329851c12662a2429eccce7fb2d0016d07905eeccabaa72b8ee0534ecfb158"} Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.191525 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-36cd-account-create-update-5bllx"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.192219 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-kpd2l operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-36cd-account-create-update-5bllx" podUID="e2931866-1e7d-46e7-833f-b285d8514234" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpd2l\" (UniqueName: \"kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201124 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201349 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201359 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69accab-69f4-4f35-91cc-b9fb1d0fded2-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201369 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201378 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkllr\" (UniqueName: \"kubernetes.io/projected/e69accab-69f4-4f35-91cc-b9fb1d0fded2-kube-api-access-wkllr\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.201476 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.202453 4892 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.202520 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts 
podName:e2931866-1e7d-46e7-833f-b285d8514234 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:35.70250361 +0000 UTC m=+1427.077906875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts") pod "keystone-36cd-account-create-update-5bllx" (UID: "e2931866-1e7d-46e7-833f-b285d8514234") : configmap "openstack-scripts" not found Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.211004 4892 projected.go:194] Error preparing data for projected volume kube-api-access-kpd2l for pod openstack/keystone-36cd-account-create-update-5bllx: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.211059 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l podName:e2931866-1e7d-46e7-833f-b285d8514234 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:35.71104268 +0000 UTC m=+1427.086445945 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kpd2l" (UniqueName: "kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l") pod "keystone-36cd-account-create-update-5bllx" (UID: "e2931866-1e7d-46e7-833f-b285d8514234") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.215547 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.243512 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.255684 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zf6jg"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.273101 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.273161 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] 
Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.282844 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.282988 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.283009 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.286630 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zf6jg"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.292682 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.295014 4892 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.295052 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.300281 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.324167 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.324242 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" containerName="nova-scheduler-scheduler" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.333286 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.353085 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6vq9m"] Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.383220 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0126c9e0-502f-4109-a6cf-25eccd572dff" path="/var/lib/kubelet/pods/0126c9e0-502f-4109-a6cf-25eccd572dff/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.384144 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e67c77-46ba-4230-b84d-bc8e6952b2d8" path="/var/lib/kubelet/pods/17e67c77-46ba-4230-b84d-bc8e6952b2d8/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.384635 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40312e89-cbb3-4c9f-b8bc-33113bd6462f" path="/var/lib/kubelet/pods/40312e89-cbb3-4c9f-b8bc-33113bd6462f/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.385142 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588baeb0-b9ae-4dcf-83c6-36b641359bfd" path="/var/lib/kubelet/pods/588baeb0-b9ae-4dcf-83c6-36b641359bfd/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.386114 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60523b2e-a498-4bc9-920b-32f117afb898" path="/var/lib/kubelet/pods/60523b2e-a498-4bc9-920b-32f117afb898/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.386785 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9" path="/var/lib/kubelet/pods/c031fb1d-01fe-4f4d-8a7a-2a49cb75bbb9/volumes" Feb 17 18:07:35 crc 
kubenswrapper[4892]: I0217 18:07:35.387225 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e19d46-3d45-42e4-b59f-d25925415b3c" path="/var/lib/kubelet/pods/d5e19d46-3d45-42e4-b59f-d25925415b3c/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.387708 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daec0414-a6e3-4f1d-bc43-cccfa0444894" path="/var/lib/kubelet/pods/daec0414-a6e3-4f1d-bc43-cccfa0444894/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.388777 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e335cae7-2188-47a9-8eef-f53353b22dd4" path="/var/lib/kubelet/pods/e335cae7-2188-47a9-8eef-f53353b22dd4/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.389256 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa" path="/var/lib/kubelet/pods/e9de8b31-40f1-42b9-b9c2-3d1cf759d9aa/volumes" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.420093 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.427332 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data" (OuterVolumeSpecName: "config-data") pod "bf3f5fcf-07b1-4f42-a5be-5b11052d080a" (UID: "bf3f5fcf-07b1-4f42-a5be-5b11052d080a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.524393 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3f5fcf-07b1-4f42-a5be-5b11052d080a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.638269 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data" (OuterVolumeSpecName: "config-data") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.654240 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.666080 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.728771 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpd2l\" (UniqueName: \"kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.728854 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.728913 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.729391 4892 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.729431 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.729484 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts podName:e2931866-1e7d-46e7-833f-b285d8514234 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:36.729467743 +0000 UTC m=+1428.104871008 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts") pod "keystone-36cd-account-create-update-5bllx" (UID: "e2931866-1e7d-46e7-833f-b285d8514234") : configmap "openstack-scripts" not found Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.729548 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.729563 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.729575 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.731631 4892 projected.go:194] Error preparing data for projected volume kube-api-access-kpd2l for pod openstack/keystone-36cd-account-create-update-5bllx: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 17 18:07:35 crc kubenswrapper[4892]: E0217 18:07:35.731681 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l podName:e2931866-1e7d-46e7-833f-b285d8514234 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:36.731665212 +0000 UTC m=+1428.107068477 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kpd2l" (UniqueName: "kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l") pod "keystone-36cd-account-create-update-5bllx" (UID: "e2931866-1e7d-46e7-833f-b285d8514234") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.744063 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.746617 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-config-data" (OuterVolumeSpecName: "config-data") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.753315 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.772025 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.775295 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-config-data" (OuterVolumeSpecName: "config-data") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.779752 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-config-data" (OuterVolumeSpecName: "config-data") pod "e69accab-69f4-4f35-91cc-b9fb1d0fded2" (UID: "e69accab-69f4-4f35-91cc-b9fb1d0fded2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.781492 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835178 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835204 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835214 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835222 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835231 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835281 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.835290 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69accab-69f4-4f35-91cc-b9fb1d0fded2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.848932 4892 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" (UID: "dcbe58d5-c580-4a8c-8476-dd29bf7ca91b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.924899 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="galera" containerID="cri-o://55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3" gracePeriod=30 Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.938473 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:35 crc kubenswrapper[4892]: I0217 18:07:35.975369 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" (UID: "bb2d2d5a-6727-4c83-800b-03a6cf43b9c1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.023033 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.040282 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.040311 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.064047 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.131:5671: connect: connection refused" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.071108 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c6047a2-148b-46cf-a50b-b7147c7c9902" (UID: "5c6047a2-148b-46cf-a50b-b7147c7c9902"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.141626 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6047a2-148b-46cf-a50b-b7147c7c9902-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.184584 4892 generic.go:334] "Generic (PLEG): container finished" podID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerID="7576bc6a46f917916ace668163f910721525c50fdb2445e86a823be4d67ae777" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.203126 4892 generic.go:334] "Generic (PLEG): container finished" podID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerID="80a0a6f3f3f9ebbe69c194f3e074f48a042c65dd9cb6a3e8b0adc00867e049f1" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.206551 4892 generic.go:334] "Generic (PLEG): container finished" podID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" containerID="4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.212844 4892 generic.go:334] "Generic (PLEG): container finished" podID="49339b73-3846-4ce0-aa1d-d285b333c807" containerID="3444d44d864230addafb40b6ea7d86a4aa633d99b8d06f32e0785eca01c03058" exitCode=1 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.214946 4892 generic.go:334] "Generic (PLEG): container finished" podID="eb02530d-521f-427b-a570-f35de0665ecc" containerID="f69a74c1d7ff5f5e7cab3dfb70cfb2a1f2475fdc6a082b95bd0059bf7596b0bc" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.237862 4892 generic.go:334] "Generic (PLEG): container finished" podID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerID="276fbe4a642e629846be447b31c24d7070dfa435158a65fb8bc262ffc1b036a1" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.266390 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="03480c10-3249-4caa-b0da-919bbe13c03f" containerID="ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.269747 4892 generic.go:334] "Generic (PLEG): container finished" podID="dadb10bf-ed88-454e-8873-9c49f762ef6e" containerID="264f797728e12c3996d15cc2b9cd8446cea32fc84d31eeb1fec2bcc2395f7027" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.280928 4892 generic.go:334] "Generic (PLEG): container finished" podID="2e7cdd99-a572-4a20-834b-c1373e080496" containerID="7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.280967 4892 generic.go:334] "Generic (PLEG): container finished" podID="2e7cdd99-a572-4a20-834b-c1373e080496" containerID="366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4" exitCode=0 Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.281048 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328344 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68cd920a-ec23-4053-a5b6-02adbf11eaf0","Type":"ContainerDied","Data":"7576bc6a46f917916ace668163f910721525c50fdb2445e86a823be4d67ae777"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328396 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68cd920a-ec23-4053-a5b6-02adbf11eaf0","Type":"ContainerDied","Data":"5ab7188237312a8d3ad13b8d1c210fe67bf766c6e1983f2b5423b858542facb2"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328411 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ab7188237312a8d3ad13b8d1c210fe67bf766c6e1983f2b5423b858542facb2" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328427 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328445 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d47b9fd6-4gbj5" event={"ID":"8947a6cb-f018-4042-a2f8-e17591b0394d","Type":"ContainerDied","Data":"80a0a6f3f3f9ebbe69c194f3e074f48a042c65dd9cb6a3e8b0adc00867e049f1"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328462 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328480 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d47b9fd6-4gbj5" event={"ID":"8947a6cb-f018-4042-a2f8-e17591b0394d","Type":"ContainerDied","Data":"01106c64d635a5068367d9b45b8a6183bae967cc967b8dbdb618840c91c8bd1c"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328495 4892 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="01106c64d635a5068367d9b45b8a6183bae967cc967b8dbdb618840c91c8bd1c" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328510 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8affd9cf-1116-4643-8045-d445edeaa995","Type":"ContainerDied","Data":"961eb86eb44cbf469fe242743791bfef8eb78a2abe611cf76220628d1eb8f528"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328521 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="961eb86eb44cbf469fe242743791bfef8eb78a2abe611cf76220628d1eb8f528" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328533 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"253dfc82-aa27-4e6b-88a5-0af7a1d01370","Type":"ContainerDied","Data":"4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328547 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-frm9q"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328561 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7299-account-create-update-frm9q"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328585 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cq76l"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328598 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cq76l"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328615 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"253dfc82-aa27-4e6b-88a5-0af7a1d01370","Type":"ContainerDied","Data":"9907d4f455141a9cfdc6da39ffbe900759e96dff35d0ee2cdf18151bc4ab0378"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328625 4892 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="9907d4f455141a9cfdc6da39ffbe900759e96dff35d0ee2cdf18151bc4ab0378" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328641 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ca9c4ed-1247-4340-a675-b9d50dcbed99","Type":"ContainerDied","Data":"1355cce60f595bcb75a2bac3bfdd2a93600d6895b9428d228a35fb00e8fe0f9e"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328653 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1355cce60f595bcb75a2bac3bfdd2a93600d6895b9428d228a35fb00e8fe0f9e" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328662 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6vq9m" event={"ID":"49339b73-3846-4ce0-aa1d-d285b333c807","Type":"ContainerDied","Data":"3444d44d864230addafb40b6ea7d86a4aa633d99b8d06f32e0785eca01c03058"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328675 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb02530d-521f-427b-a570-f35de0665ecc","Type":"ContainerDied","Data":"f69a74c1d7ff5f5e7cab3dfb70cfb2a1f2475fdc6a082b95bd0059bf7596b0bc"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb02530d-521f-427b-a570-f35de0665ecc","Type":"ContainerDied","Data":"01477c9b0fffb8762ca0569e23eebe055e41984b3f6d8e473e0115918c26f1da"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328696 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01477c9b0fffb8762ca0569e23eebe055e41984b3f6d8e473e0115918c26f1da" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328705 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a6736a08-c35b-491c-b408-8a3dd641cd51","Type":"ContainerDied","Data":"276fbe4a642e629846be447b31c24d7070dfa435158a65fb8bc262ffc1b036a1"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328719 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6736a08-c35b-491c-b408-8a3dd641cd51","Type":"ContainerDied","Data":"c9e98a9a097e082d7d42b313d5f99f2ff1b013b10ff34d724e7f60a147bda3a5"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328727 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9e98a9a097e082d7d42b313d5f99f2ff1b013b10ff34d724e7f60a147bda3a5" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328737 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"03480c10-3249-4caa-b0da-919bbe13c03f","Type":"ContainerDied","Data":"ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328749 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"03480c10-3249-4caa-b0da-919bbe13c03f","Type":"ContainerDied","Data":"6c6be2f3cb71402eb555b468f45e50e1d07f9c0a3161f46426afeb11d5d41587"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328757 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c6be2f3cb71402eb555b468f45e50e1d07f9c0a3161f46426afeb11d5d41587" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328766 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"dadb10bf-ed88-454e-8873-9c49f762ef6e","Type":"ContainerDied","Data":"264f797728e12c3996d15cc2b9cd8446cea32fc84d31eeb1fec2bcc2395f7027"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328778 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerDied","Data":"7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.328791 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerDied","Data":"366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4"} Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.361205 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.381052 4892 scope.go:117] "RemoveContainer" containerID="30d52ee98141b0a4b18a9c3892ae7d9fadf3cc7b619452277145c17c67f590bb" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.388425 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.410569 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.435248 4892 scope.go:117] "RemoveContainer" containerID="a6f29bfcf8f5047a2ec9026b3812106d7f946e8416e1e920e156552a30f70a3e" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.472296 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljdlr\" (UniqueName: \"kubernetes.io/projected/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-kube-api-access-ljdlr\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.472327 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.506502 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.507834 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.509228 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 18:07:36 crc 
kubenswrapper[4892]: E0217 18:07:36.509264 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" containerName="nova-cell1-conductor-conductor" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.533018 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.539121 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.545842 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573291 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-config-data\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573389 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-scripts\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573506 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-httpd-run\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573529 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-public-tls-certs\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573569 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573658 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxvjh\" (UniqueName: \"kubernetes.io/projected/3ca9c4ed-1247-4340-a675-b9d50dcbed99-kube-api-access-fxvjh\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573693 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-logs\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.573742 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-combined-ca-bundle\") pod \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\" (UID: \"3ca9c4ed-1247-4340-a675-b9d50dcbed99\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.580586 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.581368 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-logs" (OuterVolumeSpecName: "logs") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.620110 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.620269 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca9c4ed-1247-4340-a675-b9d50dcbed99-kube-api-access-fxvjh" (OuterVolumeSpecName: "kube-api-access-fxvjh") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "kube-api-access-fxvjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.620425 4892 scope.go:117] "RemoveContainer" containerID="9fe16bec1150bc58439b8a3146b91ab3797ac2826fcfb2f2ad628f2449331e3c" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.621225 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.621450 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-scripts" (OuterVolumeSpecName: "scripts") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.622031 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-667fbbdb6d-fdpm7"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.642993 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-667fbbdb6d-fdpm7"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.674964 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-combined-ca-bundle\") pod \"8affd9cf-1116-4643-8045-d445edeaa995\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.675103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-certs\") pod \"8affd9cf-1116-4643-8045-d445edeaa995\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.675157 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-config\") pod \"8affd9cf-1116-4643-8045-d445edeaa995\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 
18:07:36.675265 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdg2p\" (UniqueName: \"kubernetes.io/projected/8affd9cf-1116-4643-8045-d445edeaa995-kube-api-access-qdg2p\") pod \"8affd9cf-1116-4643-8045-d445edeaa995\" (UID: \"8affd9cf-1116-4643-8045-d445edeaa995\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.675943 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.675975 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.675988 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxvjh\" (UniqueName: \"kubernetes.io/projected/3ca9c4ed-1247-4340-a675-b9d50dcbed99-kube-api-access-fxvjh\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.676001 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ca9c4ed-1247-4340-a675-b9d50dcbed99-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.676012 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.676023 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.685800 4892 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8affd9cf-1116-4643-8045-d445edeaa995-kube-api-access-qdg2p" (OuterVolumeSpecName: "kube-api-access-qdg2p") pod "8affd9cf-1116-4643-8045-d445edeaa995" (UID: "8affd9cf-1116-4643-8045-d445edeaa995"). InnerVolumeSpecName "kube-api-access-qdg2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.694640 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.714116 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.722312 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.732892 4892 scope.go:117] "RemoveContainer" containerID="7aa2c24e87c1dd7aca3eb443e9d0a1a6e4ac79e766130bd75af04a8a8e5e4d3c" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.743226 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8affd9cf-1116-4643-8045-d445edeaa995" (UID: "8affd9cf-1116-4643-8045-d445edeaa995"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.743285 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db8547d8d-ftgvm"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.743748 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.748875 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.753822 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db8547d8d-ftgvm"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.756845 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-config-data" (OuterVolumeSpecName: "config-data") pod "3ca9c4ed-1247-4340-a675-b9d50dcbed99" (UID: "3ca9c4ed-1247-4340-a675-b9d50dcbed99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.767711 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "8affd9cf-1116-4643-8045-d445edeaa995" (UID: "8affd9cf-1116-4643-8045-d445edeaa995"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.768441 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.770106 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.778865 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-public-tls-certs\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.778926 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data-custom\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.778971 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-internal-tls-certs\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779015 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779034 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8947a6cb-f018-4042-a2f8-e17591b0394d-logs\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779051 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-combined-ca-bundle\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779074 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcgkv\" (UniqueName: \"kubernetes.io/projected/8947a6cb-f018-4042-a2f8-e17591b0394d-kube-api-access-tcgkv\") pod \"8947a6cb-f018-4042-a2f8-e17591b0394d\" (UID: \"8947a6cb-f018-4042-a2f8-e17591b0394d\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779270 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpd2l\" (UniqueName: \"kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779398 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts\") pod \"keystone-36cd-account-create-update-5bllx\" (UID: \"e2931866-1e7d-46e7-833f-b285d8514234\") " pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779781 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779794 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779804 4892 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779827 4892 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779838 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca9c4ed-1247-4340-a675-b9d50dcbed99-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.779848 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdg2p\" (UniqueName: \"kubernetes.io/projected/8affd9cf-1116-4643-8045-d445edeaa995-kube-api-access-qdg2p\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.779905 4892 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.779952 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts podName:e2931866-1e7d-46e7-833f-b285d8514234 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:38.779936707 +0000 UTC m=+1430.155339972 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts") pod "keystone-36cd-account-create-update-5bllx" (UID: "e2931866-1e7d-46e7-833f-b285d8514234") : configmap "openstack-scripts" not found Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.780470 4892 scope.go:117] "RemoveContainer" containerID="c7cb681f90d6976e6b5cbf00e7ea59a76be308b0fe3edc7745833b928200cf4d" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.782909 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.783276 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "8affd9cf-1116-4643-8045-d445edeaa995" (UID: "8affd9cf-1116-4643-8045-d445edeaa995"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.783411 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.784030 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8947a6cb-f018-4042-a2f8-e17591b0394d-logs" (OuterVolumeSpecName: "logs") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.784054 4892 projected.go:194] Error preparing data for projected volume kube-api-access-kpd2l for pod openstack/keystone-36cd-account-create-update-5bllx: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 17 18:07:36 crc kubenswrapper[4892]: E0217 18:07:36.784117 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l podName:e2931866-1e7d-46e7-833f-b285d8514234 nodeName:}" failed. No retries permitted until 2026-02-17 18:07:38.784097689 +0000 UTC m=+1430.159501004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kpd2l" (UniqueName: "kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l") pod "keystone-36cd-account-create-update-5bllx" (UID: "e2931866-1e7d-46e7-833f-b285d8514234") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.784441 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8947a6cb-f018-4042-a2f8-e17591b0394d-kube-api-access-tcgkv" (OuterVolumeSpecName: "kube-api-access-tcgkv") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "kube-api-access-tcgkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.791727 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-787594b47-2xt6h"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.804779 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.809969 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-787594b47-2xt6h"] Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.811323 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.838717 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.843018 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.854487 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.854649 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.866022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.866677 4892 scope.go:117] "RemoveContainer" containerID="70e153e496f3b77144597bb37e7faa188dca474fe0e9e263bb0ae95405212f33" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.866786 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.880753 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-scripts\") pod \"a6736a08-c35b-491c-b408-8a3dd641cd51\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.880839 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data\") pod \"a6736a08-c35b-491c-b408-8a3dd641cd51\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.880893 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-nova-metadata-tls-certs\") pod \"eb02530d-521f-427b-a570-f35de0665ecc\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.880915 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-config-data\") pod \"eb02530d-521f-427b-a570-f35de0665ecc\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.880934 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-config-data\") pod \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.880957 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8vkv\" (UniqueName: 
\"kubernetes.io/projected/253dfc82-aa27-4e6b-88a5-0af7a1d01370-kube-api-access-s8vkv\") pod \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881002 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68cd920a-ec23-4053-a5b6-02adbf11eaf0-logs\") pod \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881032 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsbtx\" (UniqueName: \"kubernetes.io/projected/03480c10-3249-4caa-b0da-919bbe13c03f-kube-api-access-tsbtx\") pod \"03480c10-3249-4caa-b0da-919bbe13c03f\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881051 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-config-data\") pod \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881072 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data-custom\") pod \"a6736a08-c35b-491c-b408-8a3dd641cd51\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881112 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-combined-ca-bundle\") pod \"eb02530d-521f-427b-a570-f35de0665ecc\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 
18:07:36.881140 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-public-tls-certs\") pod \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881169 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02530d-521f-427b-a570-f35de0665ecc-logs\") pod \"eb02530d-521f-427b-a570-f35de0665ecc\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881209 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-combined-ca-bundle\") pod \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881235 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-internal-tls-certs\") pod \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881256 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-combined-ca-bundle\") pod \"a6736a08-c35b-491c-b408-8a3dd641cd51\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881273 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4rj\" (UniqueName: \"kubernetes.io/projected/eb02530d-521f-427b-a570-f35de0665ecc-kube-api-access-rk4rj\") pod 
\"eb02530d-521f-427b-a570-f35de0665ecc\" (UID: \"eb02530d-521f-427b-a570-f35de0665ecc\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881293 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-combined-ca-bundle\") pod \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\" (UID: \"253dfc82-aa27-4e6b-88a5-0af7a1d01370\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzc7f\" (UniqueName: \"kubernetes.io/projected/68cd920a-ec23-4053-a5b6-02adbf11eaf0-kube-api-access-wzc7f\") pod \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\" (UID: \"68cd920a-ec23-4053-a5b6-02adbf11eaf0\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881347 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-combined-ca-bundle\") pod \"03480c10-3249-4caa-b0da-919bbe13c03f\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881379 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-config-data\") pod \"03480c10-3249-4caa-b0da-919bbe13c03f\" (UID: \"03480c10-3249-4caa-b0da-919bbe13c03f\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881414 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gps8k\" (UniqueName: \"kubernetes.io/projected/a6736a08-c35b-491c-b408-8a3dd641cd51-kube-api-access-gps8k\") pod \"a6736a08-c35b-491c-b408-8a3dd641cd51\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881439 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6736a08-c35b-491c-b408-8a3dd641cd51-etc-machine-id\") pod \"a6736a08-c35b-491c-b408-8a3dd641cd51\" (UID: \"a6736a08-c35b-491c-b408-8a3dd641cd51\") " Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881898 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881916 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881925 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881934 4892 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8affd9cf-1116-4643-8045-d445edeaa995-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881945 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8947a6cb-f018-4042-a2f8-e17591b0394d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881954 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.881962 4892 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-tcgkv\" (UniqueName: \"kubernetes.io/projected/8947a6cb-f018-4042-a2f8-e17591b0394d-kube-api-access-tcgkv\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.882007 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6736a08-c35b-491c-b408-8a3dd641cd51-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a6736a08-c35b-491c-b408-8a3dd641cd51" (UID: "a6736a08-c35b-491c-b408-8a3dd641cd51"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.884656 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-scripts" (OuterVolumeSpecName: "scripts") pod "a6736a08-c35b-491c-b408-8a3dd641cd51" (UID: "a6736a08-c35b-491c-b408-8a3dd641cd51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.890358 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68cd920a-ec23-4053-a5b6-02adbf11eaf0-logs" (OuterVolumeSpecName: "logs") pod "68cd920a-ec23-4053-a5b6-02adbf11eaf0" (UID: "68cd920a-ec23-4053-a5b6-02adbf11eaf0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.890655 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb02530d-521f-427b-a570-f35de0665ecc-kube-api-access-rk4rj" (OuterVolumeSpecName: "kube-api-access-rk4rj") pod "eb02530d-521f-427b-a570-f35de0665ecc" (UID: "eb02530d-521f-427b-a570-f35de0665ecc"). InnerVolumeSpecName "kube-api-access-rk4rj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.891111 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb02530d-521f-427b-a570-f35de0665ecc-logs" (OuterVolumeSpecName: "logs") pod "eb02530d-521f-427b-a570-f35de0665ecc" (UID: "eb02530d-521f-427b-a570-f35de0665ecc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.903601 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cd920a-ec23-4053-a5b6-02adbf11eaf0-kube-api-access-wzc7f" (OuterVolumeSpecName: "kube-api-access-wzc7f") pod "68cd920a-ec23-4053-a5b6-02adbf11eaf0" (UID: "68cd920a-ec23-4053-a5b6-02adbf11eaf0"). InnerVolumeSpecName "kube-api-access-wzc7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.909530 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03480c10-3249-4caa-b0da-919bbe13c03f-kube-api-access-tsbtx" (OuterVolumeSpecName: "kube-api-access-tsbtx") pod "03480c10-3249-4caa-b0da-919bbe13c03f" (UID: "03480c10-3249-4caa-b0da-919bbe13c03f"). InnerVolumeSpecName "kube-api-access-tsbtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.924978 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a6736a08-c35b-491c-b408-8a3dd641cd51" (UID: "a6736a08-c35b-491c-b408-8a3dd641cd51"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.925121 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253dfc82-aa27-4e6b-88a5-0af7a1d01370-kube-api-access-s8vkv" (OuterVolumeSpecName: "kube-api-access-s8vkv") pod "253dfc82-aa27-4e6b-88a5-0af7a1d01370" (UID: "253dfc82-aa27-4e6b-88a5-0af7a1d01370"). InnerVolumeSpecName "kube-api-access-s8vkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.925151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6736a08-c35b-491c-b408-8a3dd641cd51-kube-api-access-gps8k" (OuterVolumeSpecName: "kube-api-access-gps8k") pod "a6736a08-c35b-491c-b408-8a3dd641cd51" (UID: "a6736a08-c35b-491c-b408-8a3dd641cd51"). InnerVolumeSpecName "kube-api-access-gps8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.934668 4892 scope.go:117] "RemoveContainer" containerID="f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2" Feb 17 18:07:36 crc kubenswrapper[4892]: I0217 18:07:36.965147 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03480c10-3249-4caa-b0da-919bbe13c03f" (UID: "03480c10-3249-4caa-b0da-919bbe13c03f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.982694 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47gz\" (UniqueName: \"kubernetes.io/projected/dadb10bf-ed88-454e-8873-9c49f762ef6e-kube-api-access-g47gz\") pod \"dadb10bf-ed88-454e-8873-9c49f762ef6e\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.982737 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-config-data\") pod \"dadb10bf-ed88-454e-8873-9c49f762ef6e\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983022 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-combined-ca-bundle\") pod \"dadb10bf-ed88-454e-8873-9c49f762ef6e\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983074 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-memcached-tls-certs\") pod \"dadb10bf-ed88-454e-8873-9c49f762ef6e\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983149 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-kolla-config\") pod \"dadb10bf-ed88-454e-8873-9c49f762ef6e\" (UID: \"dadb10bf-ed88-454e-8873-9c49f762ef6e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983172 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2qwd4\" (UniqueName: \"kubernetes.io/projected/49339b73-3846-4ce0-aa1d-d285b333c807-kube-api-access-2qwd4\") pod \"49339b73-3846-4ce0-aa1d-d285b333c807\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983255 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49339b73-3846-4ce0-aa1d-d285b333c807-operator-scripts\") pod \"49339b73-3846-4ce0-aa1d-d285b333c807\" (UID: \"49339b73-3846-4ce0-aa1d-d285b333c807\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983695 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983710 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gps8k\" (UniqueName: \"kubernetes.io/projected/a6736a08-c35b-491c-b408-8a3dd641cd51-kube-api-access-gps8k\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983721 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6736a08-c35b-491c-b408-8a3dd641cd51-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983730 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983740 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8vkv\" (UniqueName: \"kubernetes.io/projected/253dfc82-aa27-4e6b-88a5-0af7a1d01370-kube-api-access-s8vkv\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983748 
4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68cd920a-ec23-4053-a5b6-02adbf11eaf0-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983757 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsbtx\" (UniqueName: \"kubernetes.io/projected/03480c10-3249-4caa-b0da-919bbe13c03f-kube-api-access-tsbtx\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983769 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983778 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02530d-521f-427b-a570-f35de0665ecc-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983786 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4rj\" (UniqueName: \"kubernetes.io/projected/eb02530d-521f-427b-a570-f35de0665ecc-kube-api-access-rk4rj\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.983794 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzc7f\" (UniqueName: \"kubernetes.io/projected/68cd920a-ec23-4053-a5b6-02adbf11eaf0-kube-api-access-wzc7f\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.984493 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49339b73-3846-4ce0-aa1d-d285b333c807-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49339b73-3846-4ce0-aa1d-d285b333c807" (UID: "49339b73-3846-4ce0-aa1d-d285b333c807"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.985019 4892 scope.go:117] "RemoveContainer" containerID="8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.986801 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-config-data" (OuterVolumeSpecName: "config-data") pod "dadb10bf-ed88-454e-8873-9c49f762ef6e" (UID: "dadb10bf-ed88-454e-8873-9c49f762ef6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:36.989531 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "dadb10bf-ed88-454e-8873-9c49f762ef6e" (UID: "dadb10bf-ed88-454e-8873-9c49f762ef6e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.046302 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49339b73-3846-4ce0-aa1d-d285b333c807-kube-api-access-2qwd4" (OuterVolumeSpecName: "kube-api-access-2qwd4") pod "49339b73-3846-4ce0-aa1d-d285b333c807" (UID: "49339b73-3846-4ce0-aa1d-d285b333c807"). InnerVolumeSpecName "kube-api-access-2qwd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.048053 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadb10bf-ed88-454e-8873-9c49f762ef6e-kube-api-access-g47gz" (OuterVolumeSpecName: "kube-api-access-g47gz") pod "dadb10bf-ed88-454e-8873-9c49f762ef6e" (UID: "dadb10bf-ed88-454e-8873-9c49f762ef6e"). InnerVolumeSpecName "kube-api-access-g47gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.056737 4892 scope.go:117] "RemoveContainer" containerID="f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.061049 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2\": container with ID starting with f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2 not found: ID does not exist" containerID="f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.061081 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2"} err="failed to get container status \"f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2\": rpc error: code = NotFound desc = could not find container \"f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2\": container with ID starting with f688928d9f46e1289f6ea6452cd0e3a1da9e18c9de2b9d48b2e5baa997df9da2 not found: ID does not exist" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.061101 4892 scope.go:117] "RemoveContainer" containerID="8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.069947 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3\": container with ID starting with 8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3 not found: ID does not exist" containerID="8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.069978 
4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3"} err="failed to get container status \"8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3\": rpc error: code = NotFound desc = could not find container \"8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3\": container with ID starting with 8050f8a9a53f9cfd74960f06c20bc4ea0cfea7a73aa66de78f0a44fc8caa93e3 not found: ID does not exist" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.069999 4892 scope.go:117] "RemoveContainer" containerID="a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.085288 4892 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.085315 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qwd4\" (UniqueName: \"kubernetes.io/projected/49339b73-3846-4ce0-aa1d-d285b333c807-kube-api-access-2qwd4\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.085325 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49339b73-3846-4ce0-aa1d-d285b333c807-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.085334 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47gz\" (UniqueName: \"kubernetes.io/projected/dadb10bf-ed88-454e-8873-9c49f762ef6e-kube-api-access-g47gz\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.085343 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/dadb10bf-ed88-454e-8873-9c49f762ef6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.086280 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "eb02530d-521f-427b-a570-f35de0665ecc" (UID: "eb02530d-521f-427b-a570-f35de0665ecc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.100776 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data" (OuterVolumeSpecName: "config-data") pod "8947a6cb-f018-4042-a2f8-e17591b0394d" (UID: "8947a6cb-f018-4042-a2f8-e17591b0394d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.108042 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68cd920a-ec23-4053-a5b6-02adbf11eaf0" (UID: "68cd920a-ec23-4053-a5b6-02adbf11eaf0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.113944 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-config-data" (OuterVolumeSpecName: "config-data") pod "03480c10-3249-4caa-b0da-919bbe13c03f" (UID: "03480c10-3249-4caa-b0da-919bbe13c03f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.113955 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68cd920a-ec23-4053-a5b6-02adbf11eaf0" (UID: "68cd920a-ec23-4053-a5b6-02adbf11eaf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.115499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-config-data" (OuterVolumeSpecName: "config-data") pod "253dfc82-aa27-4e6b-88a5-0af7a1d01370" (UID: "253dfc82-aa27-4e6b-88a5-0af7a1d01370"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.122010 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-config-data" (OuterVolumeSpecName: "config-data") pod "eb02530d-521f-427b-a570-f35de0665ecc" (UID: "eb02530d-521f-427b-a570-f35de0665ecc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.123022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "253dfc82-aa27-4e6b-88a5-0af7a1d01370" (UID: "253dfc82-aa27-4e6b-88a5-0af7a1d01370"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.136440 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb02530d-521f-427b-a570-f35de0665ecc" (UID: "eb02530d-521f-427b-a570-f35de0665ecc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.141689 4892 scope.go:117] "RemoveContainer" containerID="094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.148994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-config-data" (OuterVolumeSpecName: "config-data") pod "68cd920a-ec23-4053-a5b6-02adbf11eaf0" (UID: "68cd920a-ec23-4053-a5b6-02adbf11eaf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.161966 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6736a08-c35b-491c-b408-8a3dd641cd51" (UID: "a6736a08-c35b-491c-b408-8a3dd641cd51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.163943 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dadb10bf-ed88-454e-8873-9c49f762ef6e" (UID: "dadb10bf-ed88-454e-8873-9c49f762ef6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.173244 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "dadb10bf-ed88-454e-8873-9c49f762ef6e" (UID: "dadb10bf-ed88-454e-8873-9c49f762ef6e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.175135 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68cd920a-ec23-4053-a5b6-02adbf11eaf0" (UID: "68cd920a-ec23-4053-a5b6-02adbf11eaf0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186901 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186928 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186936 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186945 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186954 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03480c10-3249-4caa-b0da-919bbe13c03f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186962 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8947a6cb-f018-4042-a2f8-e17591b0394d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186971 4892 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186980 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.186988 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.187007 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.187015 4892 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadb10bf-ed88-454e-8873-9c49f762ef6e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 
crc kubenswrapper[4892]: I0217 18:07:37.187023 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253dfc82-aa27-4e6b-88a5-0af7a1d01370-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.187032 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02530d-521f-427b-a570-f35de0665ecc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.187039 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68cd920a-ec23-4053-a5b6-02adbf11eaf0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.199510 4892 scope.go:117] "RemoveContainer" containerID="a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.199988 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e\": container with ID starting with a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e not found: ID does not exist" containerID="a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.200015 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e"} err="failed to get container status \"a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e\": rpc error: code = NotFound desc = could not find container \"a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e\": container with ID starting with a6444be986863902c53a81d49703d9af2c8e3377fbc24ad52b64139d345c030e not 
found: ID does not exist" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.200035 4892 scope.go:117] "RemoveContainer" containerID="094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.200329 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc\": container with ID starting with 094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc not found: ID does not exist" containerID="094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.200351 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc"} err="failed to get container status \"094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc\": rpc error: code = NotFound desc = could not find container \"094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc\": container with ID starting with 094f4083ae3da865c0c77d4d0a55be29b2800ab643f2367af2aa5f359e590bfc not found: ID does not exist" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.213993 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data" (OuterVolumeSpecName: "config-data") pod "a6736a08-c35b-491c-b408-8a3dd641cd51" (UID: "a6736a08-c35b-491c-b408-8a3dd641cd51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.288206 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6736a08-c35b-491c-b408-8a3dd641cd51-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.301581 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b688e91-e3c2-4a0f-a784-a694f951ea5e/ovn-northd/0.log" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.301625 4892 generic.go:334] "Generic (PLEG): container finished" podID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerID="ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a" exitCode=139 Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.301678 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b688e91-e3c2-4a0f-a784-a694f951ea5e","Type":"ContainerDied","Data":"ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a"} Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.305241 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"dadb10bf-ed88-454e-8873-9c49f762ef6e","Type":"ContainerDied","Data":"dbbbecb0ba3fdbb6d53c711ee669f5b906795c9e893eb7463eb720302edeef15"} Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.305273 4892 scope.go:117] "RemoveContainer" containerID="264f797728e12c3996d15cc2b9cd8446cea32fc84d31eeb1fec2bcc2395f7027" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.305375 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.308050 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6vq9m" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.308050 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6vq9m" event={"ID":"49339b73-3846-4ce0-aa1d-d285b333c807","Type":"ContainerDied","Data":"fc329851c12662a2429eccce7fb2d0016d07905eeccabaa72b8ee0534ecfb158"} Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.313214 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.313261 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318732 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318732 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318754 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318761 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318760 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77d47b9fd6-4gbj5" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318777 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-36cd-account-create-update-5bllx" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.318792 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.336530 4892 scope.go:117] "RemoveContainer" containerID="3444d44d864230addafb40b6ea7d86a4aa633d99b8d06f32e0785eca01c03058" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.605155 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38862686-bfab-4f7d-8367-ec59a68b0299" path="/var/lib/kubelet/pods/38862686-bfab-4f7d-8367-ec59a68b0299/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.609914 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.610024 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6047a2-148b-46cf-a50b-b7147c7c9902" path="/var/lib/kubelet/pods/5c6047a2-148b-46cf-a50b-b7147c7c9902/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.613601 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.613683 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a130c05-1db7-4d93-899e-05d086a2bcca" path="/var/lib/kubelet/pods/7a130c05-1db7-4d93-899e-05d086a2bcca/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.614115 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a1925e-e87d-4d6e-b9a3-35e46d58fc54" 
path="/var/lib/kubelet/pods/a6a1925e-e87d-4d6e-b9a3-35e46d58fc54/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.614402 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9" path="/var/lib/kubelet/pods/bb02be00-7b9c-4e4f-bee4-65a6fe46c0c9/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.617936 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.617966 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="galera" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.618544 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2d2d5a-6727-4c83-800b-03a6cf43b9c1" path="/var/lib/kubelet/pods/bb2d2d5a-6727-4c83-800b-03a6cf43b9c1/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.619181 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3f5fcf-07b1-4f42-a5be-5b11052d080a" path="/var/lib/kubelet/pods/bf3f5fcf-07b1-4f42-a5be-5b11052d080a/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.619729 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" path="/var/lib/kubelet/pods/dcbe58d5-c580-4a8c-8476-dd29bf7ca91b/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.622076 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" path="/var/lib/kubelet/pods/e69accab-69f4-4f35-91cc-b9fb1d0fded2/volumes" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.814342 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b688e91-e3c2-4a0f-a784-a694f951ea5e/ovn-northd/0.log" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.814441 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.831219 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 17 18:07:37 crc kubenswrapper[4892]: E0217 18:07:37.850878 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49339b73_3846_4ce0_aa1d_d285b333c807.slice/crio-fc329851c12662a2429eccce7fb2d0016d07905eeccabaa72b8ee0534ecfb158\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a991e29_288f_453d_9bb4_f8d90a2689ad.slice/crio-conmon-fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a991e29_288f_453d_9bb4_f8d90a2689ad.slice/crio-fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadb10bf_ed88_454e_8873_9c49f762ef6e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49339b73_3846_4ce0_aa1d_d285b333c807.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadb10bf_ed88_454e_8873_9c49f762ef6e.slice/crio-dbbbecb0ba3fdbb6d53c711ee669f5b906795c9e893eb7463eb720302edeef15\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2931866_1e7d_46e7_833f_b285d8514234.slice\": RecentStats: unable to find data in memory cache]" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.852875 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.860748 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6vq9m"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.871757 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6vq9m"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.895687 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.912846 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-metrics-certs-tls-certs\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.912888 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-rundir\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.913023 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-combined-ca-bundle\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.913078 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-northd-tls-certs\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.913138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l62xg\" (UniqueName: \"kubernetes.io/projected/5b688e91-e3c2-4a0f-a784-a694f951ea5e-kube-api-access-l62xg\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.913194 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-scripts\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.913219 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-config\") pod \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\" (UID: \"5b688e91-e3c2-4a0f-a784-a694f951ea5e\") " Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.914318 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-config" (OuterVolumeSpecName: "config") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.915389 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.916686 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.917575 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-scripts" (OuterVolumeSpecName: "scripts") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.923325 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b688e91-e3c2-4a0f-a784-a694f951ea5e-kube-api-access-l62xg" (OuterVolumeSpecName: "kube-api-access-l62xg") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "kube-api-access-l62xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.929851 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.937801 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.957035 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:37 crc kubenswrapper[4892]: I0217 18:07:37.987971 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.005393 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "5b688e91-e3c2-4a0f-a784-a694f951ea5e" (UID: "5b688e91-e3c2-4a0f-a784-a694f951ea5e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.014977 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.015001 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l62xg\" (UniqueName: \"kubernetes.io/projected/5b688e91-e3c2-4a0f-a784-a694f951ea5e-kube-api-access-l62xg\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.015013 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.015021 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b688e91-e3c2-4a0f-a784-a694f951ea5e-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.015028 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.015036 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b688e91-e3c2-4a0f-a784-a694f951ea5e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.015043 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b688e91-e3c2-4a0f-a784-a694f951ea5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.044969 
4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.102337 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77d47b9fd6-4gbj5"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.105124 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-77d47b9fd6-4gbj5"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.115938 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a991e29-288f-453d-9bb4-f8d90a2689ad-erlang-cookie-secret\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116044 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-confd\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116079 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-config-data\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116106 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a991e29-288f-453d-9bb4-f8d90a2689ad-pod-info\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116122 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-server-conf\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116145 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116265 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-erlang-cookie\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116348 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-plugins-conf\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116375 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-tls\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.116401 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkdw2\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-kube-api-access-rkdw2\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc 
kubenswrapper[4892]: I0217 18:07:38.116419 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-plugins\") pod \"3a991e29-288f-453d-9bb4-f8d90a2689ad\" (UID: \"3a991e29-288f-453d-9bb4-f8d90a2689ad\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.119994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.120976 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.121471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.123451 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.124583 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a991e29-288f-453d-9bb4-f8d90a2689ad-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.125242 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.125389 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3a991e29-288f-453d-9bb4-f8d90a2689ad-pod-info" (OuterVolumeSpecName: "pod-info") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.129915 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-kube-api-access-rkdw2" (OuterVolumeSpecName: "kube-api-access-rkdw2") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "kube-api-access-rkdw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.140575 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-36cd-account-create-update-5bllx"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.148935 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-36cd-account-create-update-5bllx"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.161350 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.170687 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.171641 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-config-data" (OuterVolumeSpecName: "config-data") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.179363 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.181424 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-server-conf" (OuterVolumeSpecName: "server-conf") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.190772 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.199656 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.215162 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.218948 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220859 4892 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a991e29-288f-453d-9bb4-f8d90a2689ad-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220892 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220904 4892 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3a991e29-288f-453d-9bb4-f8d90a2689ad-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220913 4892 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220943 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220954 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpd2l\" (UniqueName: \"kubernetes.io/projected/e2931866-1e7d-46e7-833f-b285d8514234-kube-api-access-kpd2l\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220965 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220973 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2931866-1e7d-46e7-833f-b285d8514234-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220982 4892 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a991e29-288f-453d-9bb4-f8d90a2689ad-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.220990 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 
18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.221000 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkdw2\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-kube-api-access-rkdw2\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.221009 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.224521 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.230693 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.236059 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.244218 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.252674 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3a991e29-288f-453d-9bb4-f8d90a2689ad" (UID: "3a991e29-288f-453d-9bb4-f8d90a2689ad"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.322128 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a991e29-288f-453d-9bb4-f8d90a2689ad-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.322162 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.334401 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.335454 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b688e91-e3c2-4a0f-a784-a694f951ea5e/ovn-northd/0.log" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.335512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b688e91-e3c2-4a0f-a784-a694f951ea5e","Type":"ContainerDied","Data":"9d795354563a389daa71a0017918bcc9fe39ccec245ee6211bd79ca5dfe2c2cd"} Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.335543 4892 scope.go:117] "RemoveContainer" containerID="f5ee250509d5fcfefa3da2589c17b1ebe3b3316577d89b3435a80ae89dc5bf32" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.335614 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.339043 4892 generic.go:334] "Generic (PLEG): container finished" podID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerID="55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3" exitCode=0 Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.339112 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f63d2ae-8195-4841-a7b5-f38667fc87b2","Type":"ContainerDied","Data":"55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3"} Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.339195 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.342404 4892 generic.go:334] "Generic (PLEG): container finished" podID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerID="fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea" exitCode=0 Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.342500 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.342543 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a991e29-288f-453d-9bb4-f8d90a2689ad","Type":"ContainerDied","Data":"fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea"} Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.342587 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a991e29-288f-453d-9bb4-f8d90a2689ad","Type":"ContainerDied","Data":"db247dcc000538ded4cfecb83a5e39b97feb0a16238ba40aaed9354e33407022"} Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.350219 4892 generic.go:334] "Generic (PLEG): container finished" podID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" containerID="59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f" exitCode=0 Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.350289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4","Type":"ContainerDied","Data":"59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f"} Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.388678 4892 scope.go:117] "RemoveContainer" containerID="ce10b286d2db3229a5621dafd5ff72921acba1f38e152476f3b1b8d7c9ad5d2a" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.422987 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423463 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-default\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423534 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-generated\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423591 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-operator-scripts\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423621 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-combined-ca-bundle\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423724 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxx58\" (UniqueName: \"kubernetes.io/projected/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kube-api-access-sxx58\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423775 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-galera-tls-certs\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.423865 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kolla-config\") pod \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\" (UID: \"4f63d2ae-8195-4841-a7b5-f38667fc87b2\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.424122 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.424763 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.426138 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.426434 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). 
InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.426772 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.428127 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kube-api-access-sxx58" (OuterVolumeSpecName: "kube-api-access-sxx58") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "kube-api-access-sxx58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.438244 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.443174 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.448124 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.456207 4892 scope.go:117] "RemoveContainer" containerID="55d0a5452a08a089f4f815ed51cec9a66b1e302496f9185e70ddb1d9d5ee79a3" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.456242 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.474626 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.484535 4892 scope.go:117] "RemoveContainer" containerID="7ace672204f35f9e3ce9138c9274e92b32a9ad967b4d642a8ab107066e7260aa" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.506910 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4f63d2ae-8195-4841-a7b5-f38667fc87b2" (UID: "4f63d2ae-8195-4841-a7b5-f38667fc87b2"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.508347 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.520571 4892 scope.go:117] "RemoveContainer" containerID="fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529441 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f63d2ae-8195-4841-a7b5-f38667fc87b2-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529470 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529479 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529488 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxx58\" (UniqueName: \"kubernetes.io/projected/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kube-api-access-sxx58\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529496 4892 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f63d2ae-8195-4841-a7b5-f38667fc87b2-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529521 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.529530 4892 reconciler_common.go:293] "Volume detached for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f63d2ae-8195-4841-a7b5-f38667fc87b2-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.578027 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.578637 4892 scope.go:117] "RemoveContainer" containerID="503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.630964 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvt2t\" (UniqueName: \"kubernetes.io/projected/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-kube-api-access-kvt2t\") pod \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.631110 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-config-data\") pod \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.631370 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-combined-ca-bundle\") pod \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\" (UID: \"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.631755 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.634937 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-kube-api-access-kvt2t" (OuterVolumeSpecName: "kube-api-access-kvt2t") pod "b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" (UID: "b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4"). InnerVolumeSpecName "kube-api-access-kvt2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.658414 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" (UID: "b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.667951 4892 scope.go:117] "RemoveContainer" containerID="fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea" Feb 17 18:07:38 crc kubenswrapper[4892]: E0217 18:07:38.672039 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea\": container with ID starting with fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea not found: ID does not exist" containerID="fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.672071 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea"} err="failed to get container status \"fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea\": rpc error: code = NotFound desc = could not find container \"fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea\": container with ID starting with fbd28f7697ba231fe30eb9e4f8561cf1e9d17638992ea1a393e7ed69f9406cea not found: ID 
does not exist" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.672091 4892 scope.go:117] "RemoveContainer" containerID="503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f" Feb 17 18:07:38 crc kubenswrapper[4892]: E0217 18:07:38.672317 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f\": container with ID starting with 503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f not found: ID does not exist" containerID="503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.672333 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f"} err="failed to get container status \"503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f\": rpc error: code = NotFound desc = could not find container \"503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f\": container with ID starting with 503d1ce458069660595b88c09b6233a6c992ba208fc9f8777a3919fc48271c2f not found: ID does not exist" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.679264 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-config-data" (OuterVolumeSpecName: "config-data") pod "b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" (UID: "b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.738994 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvt2t\" (UniqueName: \"kubernetes.io/projected/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-kube-api-access-kvt2t\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.739025 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.739035 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.760500 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.768261 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.845058 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.942748 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-scripts\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.942806 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-config-data\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.942848 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-fernet-keys\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.942886 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-combined-ca-bundle\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.942936 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-internal-tls-certs\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.942979 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4p6h\" (UniqueName: 
\"kubernetes.io/projected/2fd50e1e-cc22-430b-ab38-88217aeafc59-kube-api-access-x4p6h\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.943003 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-credential-keys\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.943050 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-public-tls-certs\") pod \"2fd50e1e-cc22-430b-ab38-88217aeafc59\" (UID: \"2fd50e1e-cc22-430b-ab38-88217aeafc59\") " Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.954185 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.954201 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd50e1e-cc22-430b-ab38-88217aeafc59-kube-api-access-x4p6h" (OuterVolumeSpecName: "kube-api-access-x4p6h") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "kube-api-access-x4p6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.954242 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.969087 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-scripts" (OuterVolumeSpecName: "scripts") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.997077 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:38 crc kubenswrapper[4892]: I0217 18:07:38.999906 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-config-data" (OuterVolumeSpecName: "config-data") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.003960 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.023630 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2fd50e1e-cc22-430b-ab38-88217aeafc59" (UID: "2fd50e1e-cc22-430b-ab38-88217aeafc59"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045247 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045282 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045291 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045299 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 
crc kubenswrapper[4892]: I0217 18:07:39.045307 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045315 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045324 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4p6h\" (UniqueName: \"kubernetes.io/projected/2fd50e1e-cc22-430b-ab38-88217aeafc59-kube-api-access-x4p6h\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.045334 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2fd50e1e-cc22-430b-ab38-88217aeafc59-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.064649 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.146645 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a97132b-c4ac-4645-8844-1dc5acf466a1-logs\") pod \"7a97132b-c4ac-4645-8844-1dc5acf466a1\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.146758 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-combined-ca-bundle\") pod \"7a97132b-c4ac-4645-8844-1dc5acf466a1\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.146799 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data\") pod \"7a97132b-c4ac-4645-8844-1dc5acf466a1\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.146885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8trtn\" (UniqueName: \"kubernetes.io/projected/7a97132b-c4ac-4645-8844-1dc5acf466a1-kube-api-access-8trtn\") pod \"7a97132b-c4ac-4645-8844-1dc5acf466a1\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.146946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data-custom\") pod \"7a97132b-c4ac-4645-8844-1dc5acf466a1\" (UID: \"7a97132b-c4ac-4645-8844-1dc5acf466a1\") " Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.151931 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a97132b-c4ac-4645-8844-1dc5acf466a1" (UID: "7a97132b-c4ac-4645-8844-1dc5acf466a1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.152228 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a97132b-c4ac-4645-8844-1dc5acf466a1-logs" (OuterVolumeSpecName: "logs") pod "7a97132b-c4ac-4645-8844-1dc5acf466a1" (UID: "7a97132b-c4ac-4645-8844-1dc5acf466a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.160967 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a97132b-c4ac-4645-8844-1dc5acf466a1-kube-api-access-8trtn" (OuterVolumeSpecName: "kube-api-access-8trtn") pod "7a97132b-c4ac-4645-8844-1dc5acf466a1" (UID: "7a97132b-c4ac-4645-8844-1dc5acf466a1"). InnerVolumeSpecName "kube-api-access-8trtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.168086 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.230:3000/\": dial tcp 10.217.0.230:3000: connect: connection refused" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.185318 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a97132b-c4ac-4645-8844-1dc5acf466a1" (UID: "7a97132b-c4ac-4645-8844-1dc5acf466a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.207605 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data" (OuterVolumeSpecName: "config-data") pod "7a97132b-c4ac-4645-8844-1dc5acf466a1" (UID: "7a97132b-c4ac-4645-8844-1dc5acf466a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.248760 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.248793 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.248803 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8trtn\" (UniqueName: \"kubernetes.io/projected/7a97132b-c4ac-4645-8844-1dc5acf466a1-kube-api-access-8trtn\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.248828 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a97132b-c4ac-4645-8844-1dc5acf466a1-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.248836 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a97132b-c4ac-4645-8844-1dc5acf466a1-logs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.296660 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" 
podUID="dcbe58d5-c580-4a8c-8476-dd29bf7ca91b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.194:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.381536 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03480c10-3249-4caa-b0da-919bbe13c03f" path="/var/lib/kubelet/pods/03480c10-3249-4caa-b0da-919bbe13c03f/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.382055 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" path="/var/lib/kubelet/pods/253dfc82-aa27-4e6b-88a5-0af7a1d01370/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.382625 4892 generic.go:334] "Generic (PLEG): container finished" podID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerID="2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c" exitCode=0 Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.382778 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.383016 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" path="/var/lib/kubelet/pods/3a991e29-288f-453d-9bb4-f8d90a2689ad/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.384482 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" path="/var/lib/kubelet/pods/3ca9c4ed-1247-4340-a675-b9d50dcbed99/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.385441 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49339b73-3846-4ce0-aa1d-d285b333c807" path="/var/lib/kubelet/pods/49339b73-3846-4ce0-aa1d-d285b333c807/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.386252 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" path="/var/lib/kubelet/pods/4f63d2ae-8195-4841-a7b5-f38667fc87b2/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.387422 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" path="/var/lib/kubelet/pods/5b688e91-e3c2-4a0f-a784-a694f951ea5e/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.387964 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" path="/var/lib/kubelet/pods/68cd920a-ec23-4053-a5b6-02adbf11eaf0/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.388975 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" path="/var/lib/kubelet/pods/8947a6cb-f018-4042-a2f8-e17591b0394d/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.389498 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8affd9cf-1116-4643-8045-d445edeaa995" 
path="/var/lib/kubelet/pods/8affd9cf-1116-4643-8045-d445edeaa995/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.391098 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" path="/var/lib/kubelet/pods/a6736a08-c35b-491c-b408-8a3dd641cd51/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.392522 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadb10bf-ed88-454e-8873-9c49f762ef6e" path="/var/lib/kubelet/pods/dadb10bf-ed88-454e-8873-9c49f762ef6e/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.393392 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2931866-1e7d-46e7-833f-b285d8514234" path="/var/lib/kubelet/pods/e2931866-1e7d-46e7-833f-b285d8514234/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.393702 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb02530d-521f-427b-a570-f35de0665ecc" path="/var/lib/kubelet/pods/eb02530d-521f-427b-a570-f35de0665ecc/volumes" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.401868 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" event={"ID":"7a97132b-c4ac-4645-8844-1dc5acf466a1","Type":"ContainerDied","Data":"2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c"} Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.401915 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fb4dcbbc-jxbk8" event={"ID":"7a97132b-c4ac-4645-8844-1dc5acf466a1","Type":"ContainerDied","Data":"85c0f6dc2cb435110be85b9a8316d687744dc6110745af68766aeda95c3c47eb"} Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.401947 4892 scope.go:117] "RemoveContainer" containerID="2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.404826 4892 generic.go:334] "Generic (PLEG): container 
finished" podID="2fd50e1e-cc22-430b-ab38-88217aeafc59" containerID="d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f" exitCode=0 Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.404881 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdc985-pc7r9" event={"ID":"2fd50e1e-cc22-430b-ab38-88217aeafc59","Type":"ContainerDied","Data":"d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f"} Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.404907 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdc985-pc7r9" event={"ID":"2fd50e1e-cc22-430b-ab38-88217aeafc59","Type":"ContainerDied","Data":"9ca7613cc626c0571e208035f848864174e3337eac62a5892d4d3c372b970eb0"} Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.404984 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfdc985-pc7r9" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.422073 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4","Type":"ContainerDied","Data":"759e1b46d3d18c535b39242cd08c91124e27c8c3b47d2ef2c3ef3853cc58d85b"} Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.422170 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.441867 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65fb4dcbbc-jxbk8"] Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.449577 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-65fb4dcbbc-jxbk8"] Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.453940 4892 scope.go:117] "RemoveContainer" containerID="2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.489007 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dfdc985-pc7r9"] Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.495555 4892 scope.go:117] "RemoveContainer" containerID="2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.495962 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dfdc985-pc7r9"] Feb 17 18:07:39 crc kubenswrapper[4892]: E0217 18:07:39.495995 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c\": container with ID starting with 2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c not found: ID does not exist" containerID="2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.496037 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c"} err="failed to get container status \"2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c\": rpc error: code = NotFound desc = could not find container 
\"2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c\": container with ID starting with 2f7908fcab6c600dd1af68c388a6d558ccc99b22a660b78e65463d1976469b2c not found: ID does not exist" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.496068 4892 scope.go:117] "RemoveContainer" containerID="2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760" Feb 17 18:07:39 crc kubenswrapper[4892]: E0217 18:07:39.496468 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760\": container with ID starting with 2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760 not found: ID does not exist" containerID="2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.496509 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760"} err="failed to get container status \"2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760\": rpc error: code = NotFound desc = could not find container \"2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760\": container with ID starting with 2ce408135d2761c455f60cc8d128a48f7b7d2c155d3ab0f6e2e8d41bb724c760 not found: ID does not exist" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.496541 4892 scope.go:117] "RemoveContainer" containerID="d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.507348 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.512273 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.527705 4892 
scope.go:117] "RemoveContainer" containerID="d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f" Feb 17 18:07:39 crc kubenswrapper[4892]: E0217 18:07:39.528229 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f\": container with ID starting with d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f not found: ID does not exist" containerID="d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.528292 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f"} err="failed to get container status \"d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f\": rpc error: code = NotFound desc = could not find container \"d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f\": container with ID starting with d55aa965f98c1b5c7b4e27276dc194db4f2a6d3b950d18e096832f47d612e12f not found: ID does not exist" Feb 17 18:07:39 crc kubenswrapper[4892]: I0217 18:07:39.528325 4892 scope.go:117] "RemoveContainer" containerID="59c346faddf9dfe842e95a9add3e43317b2c2439005912005284fd494f8f1d8f" Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.240136 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.240901 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.241281 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.241405 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.242140 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.244901 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.246107 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:40 crc kubenswrapper[4892]: E0217 18:07:40.246151 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:07:41 crc kubenswrapper[4892]: I0217 18:07:41.297060 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 18:07:41 crc kubenswrapper[4892]: I0217 18:07:41.297249 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 18:07:41 crc kubenswrapper[4892]: I0217 18:07:41.369738 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd50e1e-cc22-430b-ab38-88217aeafc59" path="/var/lib/kubelet/pods/2fd50e1e-cc22-430b-ab38-88217aeafc59/volumes" Feb 17 18:07:41 crc kubenswrapper[4892]: I0217 18:07:41.370346 4892 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" path="/var/lib/kubelet/pods/7a97132b-c4ac-4645-8844-1dc5acf466a1/volumes" Feb 17 18:07:41 crc kubenswrapper[4892]: I0217 18:07:41.370919 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" path="/var/lib/kubelet/pods/b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4/volumes" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.262870 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319614 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-config-data\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319673 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-sg-core-conf-yaml\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319727 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-scripts\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319746 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-log-httpd\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc 
kubenswrapper[4892]: I0217 18:07:42.319784 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-run-httpd\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319922 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-ceilometer-tls-certs\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319945 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwtsm\" (UniqueName: \"kubernetes.io/projected/2e7cdd99-a572-4a20-834b-c1373e080496-kube-api-access-mwtsm\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.319968 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-combined-ca-bundle\") pod \"2e7cdd99-a572-4a20-834b-c1373e080496\" (UID: \"2e7cdd99-a572-4a20-834b-c1373e080496\") " Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.320315 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.320790 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.327996 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-scripts" (OuterVolumeSpecName: "scripts") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.328020 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7cdd99-a572-4a20-834b-c1373e080496-kube-api-access-mwtsm" (OuterVolumeSpecName: "kube-api-access-mwtsm") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "kube-api-access-mwtsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.364715 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.372917 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.397964 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.419499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-config-data" (OuterVolumeSpecName: "config-data") pod "2e7cdd99-a572-4a20-834b-c1373e080496" (UID: "2e7cdd99-a572-4a20-834b-c1373e080496"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.421602 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422181 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422259 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422316 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422370 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e7cdd99-a572-4a20-834b-c1373e080496-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422473 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422539 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwtsm\" (UniqueName: \"kubernetes.io/projected/2e7cdd99-a572-4a20-834b-c1373e080496-kube-api-access-mwtsm\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.422595 4892 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cdd99-a572-4a20-834b-c1373e080496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.471806 4892 generic.go:334] "Generic (PLEG): container finished" podID="2e7cdd99-a572-4a20-834b-c1373e080496" containerID="a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62" exitCode=0 Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.471883 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.471902 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerDied","Data":"a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62"} Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.472261 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e7cdd99-a572-4a20-834b-c1373e080496","Type":"ContainerDied","Data":"d249dc9a688f9ce92c0ecb0ce0b49077738c7d329b6a674732f4c71855355a6e"} Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.472283 4892 scope.go:117] "RemoveContainer" containerID="7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.500780 4892 scope.go:117] "RemoveContainer" containerID="a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.525136 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.533449 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.545026 4892 scope.go:117] "RemoveContainer" 
containerID="a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.575159 4892 scope.go:117] "RemoveContainer" containerID="366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.601321 4892 scope.go:117] "RemoveContainer" containerID="7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66" Feb 17 18:07:42 crc kubenswrapper[4892]: E0217 18:07:42.601716 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66\": container with ID starting with 7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66 not found: ID does not exist" containerID="7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.601758 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66"} err="failed to get container status \"7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66\": rpc error: code = NotFound desc = could not find container \"7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66\": container with ID starting with 7084822d47905f3c440ae8cc2dc02a06102fe14e3a3594895fa3088c05136b66 not found: ID does not exist" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.601777 4892 scope.go:117] "RemoveContainer" containerID="a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e" Feb 17 18:07:42 crc kubenswrapper[4892]: E0217 18:07:42.602308 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e\": container with ID starting with 
a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e not found: ID does not exist" containerID="a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.602359 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e"} err="failed to get container status \"a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e\": rpc error: code = NotFound desc = could not find container \"a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e\": container with ID starting with a9803edec44f5a7d45fe654a2bc1851bccd3b5a3b1210ee4b83cc2c6d8022f3e not found: ID does not exist" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.602398 4892 scope.go:117] "RemoveContainer" containerID="a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62" Feb 17 18:07:42 crc kubenswrapper[4892]: E0217 18:07:42.602762 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62\": container with ID starting with a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62 not found: ID does not exist" containerID="a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.602835 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62"} err="failed to get container status \"a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62\": rpc error: code = NotFound desc = could not find container \"a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62\": container with ID starting with a65ac27e331c76e69932d4652e6aaf1a21e948f273f412a3f9aa451079f2be62 not found: ID does not 
exist" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.602869 4892 scope.go:117] "RemoveContainer" containerID="366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4" Feb 17 18:07:42 crc kubenswrapper[4892]: E0217 18:07:42.603226 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4\": container with ID starting with 366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4 not found: ID does not exist" containerID="366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4" Feb 17 18:07:42 crc kubenswrapper[4892]: I0217 18:07:42.603259 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4"} err="failed to get container status \"366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4\": rpc error: code = NotFound desc = could not find container \"366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4\": container with ID starting with 366a786fe2adb1868b4256bbb8d7a92b8b2dc7273456a2f82c6931c3dc7fdab4 not found: ID does not exist" Feb 17 18:07:43 crc kubenswrapper[4892]: I0217 18:07:43.369675 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" path="/var/lib/kubelet/pods/2e7cdd99-a572-4a20-834b-c1373e080496/volumes" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.240282 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:45 crc 
kubenswrapper[4892]: E0217 18:07:45.241035 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.241316 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.241374 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.241550 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.243239 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.244293 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.244336 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.465367 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvjw2"] Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466009 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" containerName="nova-scheduler-scheduler" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466075 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" containerName="nova-scheduler-scheduler" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466129 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466210 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 
18:07:45.466269 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="ovn-northd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466317 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="ovn-northd" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466375 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466423 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466474 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466521 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api-log" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466579 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466627 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-log" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466683 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="rabbitmq" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466733 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="rabbitmq" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466793 4892 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-notification-agent" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466858 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-notification-agent" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.466923 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.466973 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-log" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467029 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="galera" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467082 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="galera" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467140 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="openstack-network-exporter" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467194 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="openstack-network-exporter" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467248 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8affd9cf-1116-4643-8045-d445edeaa995" containerName="kube-state-metrics" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467297 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8affd9cf-1116-4643-8045-d445edeaa995" containerName="kube-state-metrics" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467350 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="proxy-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467397 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="proxy-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467454 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="sg-core" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467502 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="sg-core" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467553 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467604 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-log" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467654 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-api" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467702 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-api" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467756 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" containerName="nova-cell1-conductor-conductor" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467805 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" containerName="nova-cell1-conductor-conductor" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.467877 4892 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.467953 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker-log" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468007 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468070 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468123 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="probe" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468173 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="probe" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468230 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd50e1e-cc22-430b-ab38-88217aeafc59" containerName="keystone-api" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468283 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd50e1e-cc22-430b-ab38-88217aeafc59" containerName="keystone-api" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468336 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="setup-container" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468383 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="setup-container" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="mysql-bootstrap" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468479 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="mysql-bootstrap" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468532 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="cinder-scheduler" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468585 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="cinder-scheduler" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468640 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadb10bf-ed88-454e-8873-9c49f762ef6e" containerName="memcached" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468692 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadb10bf-ed88-454e-8873-9c49f762ef6e" containerName="memcached" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468745 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-central-agent" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468792 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-central-agent" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.468865 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.468941 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-log" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.469006 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-metadata" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469058 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-metadata" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.469109 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49339b73-3846-4ce0-aa1d-d285b333c807" containerName="mariadb-account-create-update" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469158 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="49339b73-3846-4ce0-aa1d-d285b333c807" containerName="mariadb-account-create-update" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.469216 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469265 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker" Feb 17 18:07:45 crc kubenswrapper[4892]: E0217 18:07:45.469324 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03480c10-3249-4caa-b0da-919bbe13c03f" containerName="nova-cell0-conductor-conductor" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469373 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="03480c10-3249-4caa-b0da-919bbe13c03f" containerName="nova-cell0-conductor-conductor" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469601 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="49339b73-3846-4ce0-aa1d-d285b333c807" containerName="mariadb-account-create-update" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469665 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="ovn-northd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 
18:07:45.469717 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469767 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-central-agent" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469843 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8affd9cf-1116-4643-8045-d445edeaa995" containerName="kube-state-metrics" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469912 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d79298-5d7f-4bd2-a8ba-b76c1a8b44f4" containerName="nova-cell1-conductor-conductor" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.469967 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="03480c10-3249-4caa-b0da-919bbe13c03f" containerName="nova-cell0-conductor-conductor" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470018 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" containerName="barbican-worker" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470071 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470122 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="proxy-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470179 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a991e29-288f-453d-9bb4-f8d90a2689ad" containerName="rabbitmq" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470243 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="sg-core" 
Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470294 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f63d2ae-8195-4841-a7b5-f38667fc87b2" containerName="galera" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470345 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69accab-69f4-4f35-91cc-b9fb1d0fded2" containerName="glance-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470396 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="cinder-scheduler" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470443 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470499 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="253dfc82-aa27-4e6b-88a5-0af7a1d01370" containerName="nova-scheduler-scheduler" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470565 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6736a08-c35b-491c-b408-8a3dd641cd51" containerName="probe" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470619 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8947a6cb-f018-4042-a2f8-e17591b0394d" containerName="barbican-api-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470668 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd50e1e-cc22-430b-ab38-88217aeafc59" containerName="keystone-api" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470716 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470767 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a97132b-c4ac-4645-8844-1dc5acf466a1" 
containerName="barbican-worker-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470833 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadb10bf-ed88-454e-8873-9c49f762ef6e" containerName="memcached" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470896 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca9c4ed-1247-4340-a675-b9d50dcbed99" containerName="glance-httpd" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470948 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-log" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.470997 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cd920a-ec23-4053-a5b6-02adbf11eaf0" containerName="nova-api-api" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.471052 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7cdd99-a572-4a20-834b-c1373e080496" containerName="ceilometer-notification-agent" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.471102 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb02530d-521f-427b-a570-f35de0665ecc" containerName="nova-metadata-metadata" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.471150 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b688e91-e3c2-4a0f-a784-a694f951ea5e" containerName="openstack-network-exporter" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.472475 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.477222 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvjw2"] Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.586398 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgs67\" (UniqueName: \"kubernetes.io/projected/89ba360b-4b7a-4a1b-a417-923d957d1255-kube-api-access-cgs67\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.586521 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-utilities\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.586567 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-catalog-content\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.688066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-utilities\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.688873 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-catalog-content\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.688938 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgs67\" (UniqueName: \"kubernetes.io/projected/89ba360b-4b7a-4a1b-a417-923d957d1255-kube-api-access-cgs67\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.688670 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-utilities\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.689669 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-catalog-content\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.713041 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgs67\" (UniqueName: \"kubernetes.io/projected/89ba360b-4b7a-4a1b-a417-923d957d1255-kube-api-access-cgs67\") pod \"redhat-operators-bvjw2\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:45 crc kubenswrapper[4892]: I0217 18:07:45.790373 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:46 crc kubenswrapper[4892]: I0217 18:07:46.406264 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvjw2"] Feb 17 18:07:46 crc kubenswrapper[4892]: I0217 18:07:46.524962 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjw2" event={"ID":"89ba360b-4b7a-4a1b-a417-923d957d1255","Type":"ContainerStarted","Data":"907a798e694d80ab381066f29eaee7527d618df59a09afb3661d931d701ef8f8"} Feb 17 18:07:47 crc kubenswrapper[4892]: I0217 18:07:47.548766 4892 generic.go:334] "Generic (PLEG): container finished" podID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerID="1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c" exitCode=0 Feb 17 18:07:47 crc kubenswrapper[4892]: I0217 18:07:47.548832 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjw2" event={"ID":"89ba360b-4b7a-4a1b-a417-923d957d1255","Type":"ContainerDied","Data":"1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c"} Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.175319 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5758f86b57-ddm7q" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-httpd-config\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249081 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-public-tls-certs\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249274 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-config\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249313 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5wrv\" (UniqueName: \"kubernetes.io/projected/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-kube-api-access-x5wrv\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249352 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-internal-tls-certs\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249428 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-combined-ca-bundle\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.249505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-ovndb-tls-certs\") pod \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\" (UID: \"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33\") " Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.262217 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-kube-api-access-x5wrv" (OuterVolumeSpecName: "kube-api-access-x5wrv") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "kube-api-access-x5wrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.263111 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.300905 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.314593 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.316041 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.324277 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-config" (OuterVolumeSpecName: "config") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.331208 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" (UID: "5ffd802c-eec6-4c8c-a2a9-2571eba7bc33"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351212 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351441 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351452 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5wrv\" (UniqueName: \"kubernetes.io/projected/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-kube-api-access-x5wrv\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351462 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351471 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351479 4892 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.351487 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.584004 4892 
generic.go:334] "Generic (PLEG): container finished" podID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerID="fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739" exitCode=0 Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.584062 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5758f86b57-ddm7q" event={"ID":"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33","Type":"ContainerDied","Data":"fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739"} Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.584071 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5758f86b57-ddm7q" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.584089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5758f86b57-ddm7q" event={"ID":"5ffd802c-eec6-4c8c-a2a9-2571eba7bc33","Type":"ContainerDied","Data":"c5040092cf864192d68f5c9d6f8aeb72cafd9400d6af4ecc44bd3b5f5a0b00bd"} Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.584108 4892 scope.go:117] "RemoveContainer" containerID="b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.588776 4892 generic.go:334] "Generic (PLEG): container finished" podID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerID="845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9" exitCode=0 Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.588896 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjw2" event={"ID":"89ba360b-4b7a-4a1b-a417-923d957d1255","Type":"ContainerDied","Data":"845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9"} Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.626880 4892 scope.go:117] "RemoveContainer" containerID="fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.642919 4892 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5758f86b57-ddm7q"] Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.650565 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5758f86b57-ddm7q"] Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.669057 4892 scope.go:117] "RemoveContainer" containerID="b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5" Feb 17 18:07:49 crc kubenswrapper[4892]: E0217 18:07:49.669441 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5\": container with ID starting with b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5 not found: ID does not exist" containerID="b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.669478 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5"} err="failed to get container status \"b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5\": rpc error: code = NotFound desc = could not find container \"b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5\": container with ID starting with b9c026854a18722a9d26fd66f783126656e8b029c2d6648ec55c7480344c24c5 not found: ID does not exist" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.669503 4892 scope.go:117] "RemoveContainer" containerID="fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739" Feb 17 18:07:49 crc kubenswrapper[4892]: E0217 18:07:49.669937 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739\": container with ID starting with 
fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739 not found: ID does not exist" containerID="fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739" Feb 17 18:07:49 crc kubenswrapper[4892]: I0217 18:07:49.669967 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739"} err="failed to get container status \"fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739\": rpc error: code = NotFound desc = could not find container \"fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739\": container with ID starting with fae322d6a6b0720f4c3bcb028ea8f29fd05444b1ae22a5979c599cdb2c7ed739 not found: ID does not exist" Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.241456 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.242630 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.242687 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.243598 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.243639 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.248463 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.256184 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:50 crc kubenswrapper[4892]: E0217 18:07:50.256263 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:07:50 crc kubenswrapper[4892]: I0217 18:07:50.602164 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjw2" event={"ID":"89ba360b-4b7a-4a1b-a417-923d957d1255","Type":"ContainerStarted","Data":"fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f"} Feb 17 18:07:50 crc kubenswrapper[4892]: I0217 18:07:50.627800 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvjw2" podStartSLOduration=3.042481052 podStartE2EDuration="5.627779372s" podCreationTimestamp="2026-02-17 18:07:45 +0000 UTC" firstStartedPulling="2026-02-17 18:07:47.552618201 +0000 UTC m=+1438.928021476" lastFinishedPulling="2026-02-17 18:07:50.137916531 +0000 UTC m=+1441.513319796" observedRunningTime="2026-02-17 18:07:50.618554053 +0000 UTC m=+1441.993957408" watchObservedRunningTime="2026-02-17 18:07:50.627779372 +0000 UTC m=+1442.003182657" Feb 17 18:07:51 crc kubenswrapper[4892]: I0217 18:07:51.372781 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" path="/var/lib/kubelet/pods/5ffd802c-eec6-4c8c-a2a9-2571eba7bc33/volumes" Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.240826 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.241419 4892 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.242256 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.242685 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.242723 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.242783 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.244269 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:07:55 crc kubenswrapper[4892]: E0217 18:07:55.244315 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:07:55 crc kubenswrapper[4892]: I0217 18:07:55.790659 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:55 crc kubenswrapper[4892]: I0217 18:07:55.790776 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:07:56 crc kubenswrapper[4892]: I0217 18:07:56.879164 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bvjw2" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="registry-server" probeResult="failure" output=< Feb 17 18:07:56 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 18:07:56 crc kubenswrapper[4892]: > Feb 17 18:07:58 crc kubenswrapper[4892]: I0217 18:07:58.709863 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8n9s7_abdceb70-55cb-4f2a-ac20-7fe8e9f4d064/ovs-vswitchd/0.log" Feb 17 18:07:58 crc kubenswrapper[4892]: I0217 18:07:58.710978 4892 
generic.go:334] "Generic (PLEG): container finished" podID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" exitCode=137 Feb 17 18:07:58 crc kubenswrapper[4892]: I0217 18:07:58.711019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerDied","Data":"2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a"} Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.240915 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.240930 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a is running failed: container process not found" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.241510 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.241546 4892 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a is running failed: container process not found" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.241929 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.241960 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.241995 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a is running failed: container process not found" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 17 18:08:00 crc kubenswrapper[4892]: E0217 18:08:00.242032 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8n9s7" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.277071 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8n9s7_abdceb70-55cb-4f2a-ac20-7fe8e9f4d064/ovs-vswitchd/0.log" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.277732 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354153 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-etc-ovs\") pod \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354332 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-lib\") pod \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354345 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" (UID: "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354404 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-run\") pod \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354439 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-log\") pod \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354476 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh2g9\" (UniqueName: \"kubernetes.io/projected/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-kube-api-access-mh2g9\") pod \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-lib" (OuterVolumeSpecName: "var-lib") pod "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" (UID: "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354496 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-scripts\") pod \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\" (UID: \"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354506 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-log" (OuterVolumeSpecName: "var-log") pod "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" (UID: "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.354526 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-run" (OuterVolumeSpecName: "var-run") pod "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" (UID: "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.356531 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-scripts" (OuterVolumeSpecName: "scripts") pod "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" (UID: "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.357627 4892 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.357654 4892 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.357663 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.357672 4892 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.357681 4892 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-var-lib\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.372710 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-kube-api-access-mh2g9" (OuterVolumeSpecName: "kube-api-access-mh2g9") pod "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" (UID: "abdceb70-55cb-4f2a-ac20-7fe8e9f4d064"). InnerVolumeSpecName "kube-api-access-mh2g9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.460233 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh2g9\" (UniqueName: \"kubernetes.io/projected/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064-kube-api-access-mh2g9\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.733280 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerID="40d70537bfb2a1c3a8fc4994456c79434226023147aca15f0cc917598c474f70" exitCode=137 Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.733627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"40d70537bfb2a1c3a8fc4994456c79434226023147aca15f0cc917598c474f70"} Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.735228 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8n9s7_abdceb70-55cb-4f2a-ac20-7fe8e9f4d064/ovs-vswitchd/0.log" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.736025 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8n9s7" event={"ID":"abdceb70-55cb-4f2a-ac20-7fe8e9f4d064","Type":"ContainerDied","Data":"250911d16f139c55d6260ef758663ea4bd7069431179b3f2214a1dc64aac1a5c"} Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.736065 4892 scope.go:117] "RemoveContainer" containerID="2fa766c1995ec1ef80411d787e1580966f11b07f17360079458fcc8c8c5e948a" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.736111 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8n9s7" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.773096 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8n9s7"] Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.778766 4892 scope.go:117] "RemoveContainer" containerID="e1814f919569ab3fca60ffafe790f6c380bef0cb3debbd1731ef73a58c171a64" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.781133 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-8n9s7"] Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.828724 4892 scope.go:117] "RemoveContainer" containerID="e5e3d7ffade91386dbb0cee4288429c6a1a13fd89deb550ad8243529008d2c7b" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.889596 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.967824 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdkcv\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-kube-api-access-fdkcv\") pod \"0a693208-de83-4f05-b4ff-0b3e7f858c74\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.967987 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") pod \"0a693208-de83-4f05-b4ff-0b3e7f858c74\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.968632 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a693208-de83-4f05-b4ff-0b3e7f858c74-combined-ca-bundle\") pod \"0a693208-de83-4f05-b4ff-0b3e7f858c74\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " Feb 17 
18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.968673 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-lock\") pod \"0a693208-de83-4f05-b4ff-0b3e7f858c74\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.968717 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"0a693208-de83-4f05-b4ff-0b3e7f858c74\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.968768 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-cache\") pod \"0a693208-de83-4f05-b4ff-0b3e7f858c74\" (UID: \"0a693208-de83-4f05-b4ff-0b3e7f858c74\") " Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.969156 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-lock" (OuterVolumeSpecName: "lock") pod "0a693208-de83-4f05-b4ff-0b3e7f858c74" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.969238 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-cache" (OuterVolumeSpecName: "cache") pod "0a693208-de83-4f05-b4ff-0b3e7f858c74" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.969434 4892 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-lock\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.969461 4892 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a693208-de83-4f05-b4ff-0b3e7f858c74-cache\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.971181 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0a693208-de83-4f05-b4ff-0b3e7f858c74" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.971581 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-kube-api-access-fdkcv" (OuterVolumeSpecName: "kube-api-access-fdkcv") pod "0a693208-de83-4f05-b4ff-0b3e7f858c74" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74"). InnerVolumeSpecName "kube-api-access-fdkcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:00 crc kubenswrapper[4892]: I0217 18:08:00.971894 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "swift") pod "0a693208-de83-4f05-b4ff-0b3e7f858c74" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.070344 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdkcv\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-kube-api-access-fdkcv\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.070369 4892 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a693208-de83-4f05-b4ff-0b3e7f858c74-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.070422 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.090340 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.171874 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.281564 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a693208-de83-4f05-b4ff-0b3e7f858c74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a693208-de83-4f05-b4ff-0b3e7f858c74" (UID: "0a693208-de83-4f05-b4ff-0b3e7f858c74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.372959 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" path="/var/lib/kubelet/pods/abdceb70-55cb-4f2a-ac20-7fe8e9f4d064/volumes" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.375287 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a693208-de83-4f05-b4ff-0b3e7f858c74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.760457 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0a693208-de83-4f05-b4ff-0b3e7f858c74","Type":"ContainerDied","Data":"a5f99743a51ff9d0dbf46cc29bd6dccbd9b931151c253386b705f0b24776bb75"} Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.760549 4892 scope.go:117] "RemoveContainer" containerID="40d70537bfb2a1c3a8fc4994456c79434226023147aca15f0cc917598c474f70" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.760586 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.829550 4892 scope.go:117] "RemoveContainer" containerID="560fe248893cb8f1a43c62388a6d42d718ed7cc7f26d7e07babf294df3388633" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.831733 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.840096 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.853709 4892 scope.go:117] "RemoveContainer" containerID="ec801cf9cb9854df9d768df41b9402e3ae9e7297f0a3ac7eca720a4609e68e35" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.878302 4892 scope.go:117] "RemoveContainer" containerID="4a392a3a4be90b644c07bd0ce9b8c0d432df6220823557ff7db78a582c418255" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.899864 4892 scope.go:117] "RemoveContainer" containerID="651defaed761921ccf7c7a6cc3b90f243a62d645f799923b98e7b01c8fdaecf1" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.921662 4892 scope.go:117] "RemoveContainer" containerID="94a10ed2d5fee50a1d21331a9c66ebfdc3f24fee2209a9431f1451ef17c41252" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.948848 4892 scope.go:117] "RemoveContainer" containerID="4d5dff4d38a7a79f798428eae45a2013e22a5357dd2fcd764f8587d896667672" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.977334 4892 scope.go:117] "RemoveContainer" containerID="58e0df0679c6d02fa3c1eeeb9fd8b0c6a7d302f3247b2ed8e7c839d88c861c37" Feb 17 18:08:01 crc kubenswrapper[4892]: I0217 18:08:01.997673 4892 scope.go:117] "RemoveContainer" containerID="ecc298da1bf313ad808f705d76569aadd8ccd7b7dec598f4818582d15e76c068" Feb 17 18:08:02 crc kubenswrapper[4892]: I0217 18:08:02.046805 4892 scope.go:117] "RemoveContainer" containerID="0f12b3eda7e47024c7abf99bbcbb51563466d6864f1fb053fa6d1defadfb7e85" Feb 17 18:08:02 crc 
kubenswrapper[4892]: I0217 18:08:02.101205 4892 scope.go:117] "RemoveContainer" containerID="c1d342baf3f40d0f5eafc73a64df7c0cd93b1f05a6ea729e27e0c6aa8625b809" Feb 17 18:08:02 crc kubenswrapper[4892]: I0217 18:08:02.121660 4892 scope.go:117] "RemoveContainer" containerID="3cf97b246bab6355f5a8e7bc81f3bddf502ccd3c47dc0316fda01c98e9d599dc" Feb 17 18:08:02 crc kubenswrapper[4892]: I0217 18:08:02.146706 4892 scope.go:117] "RemoveContainer" containerID="6ad15eb01bd19157c80c0bb24ae02bfdeddfc5c770f3fde85bbd8a6f9ea1c582" Feb 17 18:08:02 crc kubenswrapper[4892]: I0217 18:08:02.178410 4892 scope.go:117] "RemoveContainer" containerID="5215ee6df896cbac8711678f30bfc8c34ed8386052d37de70cf674a5aa175b70" Feb 17 18:08:02 crc kubenswrapper[4892]: I0217 18:08:02.204900 4892 scope.go:117] "RemoveContainer" containerID="736363b29d80b41bd0b961a49490e559581f0ae8dd26533460a7abdb41743241" Feb 17 18:08:03 crc kubenswrapper[4892]: I0217 18:08:03.369627 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" path="/var/lib/kubelet/pods/0a693208-de83-4f05-b4ff-0b3e7f858c74/volumes" Feb 17 18:08:05 crc kubenswrapper[4892]: I0217 18:08:05.847217 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:08:05 crc kubenswrapper[4892]: I0217 18:08:05.918200 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:08:06 crc kubenswrapper[4892]: I0217 18:08:06.112355 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvjw2"] Feb 17 18:08:07 crc kubenswrapper[4892]: I0217 18:08:07.836438 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvjw2" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="registry-server" 
containerID="cri-o://fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f" gracePeriod=2 Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.335155 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.391126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-utilities\") pod \"89ba360b-4b7a-4a1b-a417-923d957d1255\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.391262 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgs67\" (UniqueName: \"kubernetes.io/projected/89ba360b-4b7a-4a1b-a417-923d957d1255-kube-api-access-cgs67\") pod \"89ba360b-4b7a-4a1b-a417-923d957d1255\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.391410 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-catalog-content\") pod \"89ba360b-4b7a-4a1b-a417-923d957d1255\" (UID: \"89ba360b-4b7a-4a1b-a417-923d957d1255\") " Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.391943 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-utilities" (OuterVolumeSpecName: "utilities") pod "89ba360b-4b7a-4a1b-a417-923d957d1255" (UID: "89ba360b-4b7a-4a1b-a417-923d957d1255"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.396037 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ba360b-4b7a-4a1b-a417-923d957d1255-kube-api-access-cgs67" (OuterVolumeSpecName: "kube-api-access-cgs67") pod "89ba360b-4b7a-4a1b-a417-923d957d1255" (UID: "89ba360b-4b7a-4a1b-a417-923d957d1255"). InnerVolumeSpecName "kube-api-access-cgs67". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.492852 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.492881 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgs67\" (UniqueName: \"kubernetes.io/projected/89ba360b-4b7a-4a1b-a417-923d957d1255-kube-api-access-cgs67\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.522204 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89ba360b-4b7a-4a1b-a417-923d957d1255" (UID: "89ba360b-4b7a-4a1b-a417-923d957d1255"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.594774 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ba360b-4b7a-4a1b-a417-923d957d1255-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.854153 4892 generic.go:334] "Generic (PLEG): container finished" podID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerID="fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f" exitCode=0 Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.854192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjw2" event={"ID":"89ba360b-4b7a-4a1b-a417-923d957d1255","Type":"ContainerDied","Data":"fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f"} Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.854221 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjw2" event={"ID":"89ba360b-4b7a-4a1b-a417-923d957d1255","Type":"ContainerDied","Data":"907a798e694d80ab381066f29eaee7527d618df59a09afb3661d931d701ef8f8"} Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.854242 4892 scope.go:117] "RemoveContainer" containerID="fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.854380 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjw2" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.886132 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvjw2"] Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.888018 4892 scope.go:117] "RemoveContainer" containerID="845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.895448 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvjw2"] Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.911405 4892 scope.go:117] "RemoveContainer" containerID="1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.934103 4892 scope.go:117] "RemoveContainer" containerID="fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f" Feb 17 18:08:08 crc kubenswrapper[4892]: E0217 18:08:08.934483 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f\": container with ID starting with fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f not found: ID does not exist" containerID="fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.934516 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f"} err="failed to get container status \"fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f\": rpc error: code = NotFound desc = could not find container \"fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f\": container with ID starting with fbedba0dd0e13fbd7696a0f9575468162d67e241a1c47d6122551969501fac1f not found: ID does 
not exist" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.934536 4892 scope.go:117] "RemoveContainer" containerID="845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9" Feb 17 18:08:08 crc kubenswrapper[4892]: E0217 18:08:08.934933 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9\": container with ID starting with 845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9 not found: ID does not exist" containerID="845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.935037 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9"} err="failed to get container status \"845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9\": rpc error: code = NotFound desc = could not find container \"845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9\": container with ID starting with 845d1f77010c36e0685c73ff930561f5de676635fd065ce1b28e9b7da4ea3db9 not found: ID does not exist" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.935124 4892 scope.go:117] "RemoveContainer" containerID="1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c" Feb 17 18:08:08 crc kubenswrapper[4892]: E0217 18:08:08.935530 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c\": container with ID starting with 1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c not found: ID does not exist" containerID="1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c" Feb 17 18:08:08 crc kubenswrapper[4892]: I0217 18:08:08.935547 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c"} err="failed to get container status \"1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c\": rpc error: code = NotFound desc = could not find container \"1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c\": container with ID starting with 1f722a44d9b759427e5d3b582daea007e79ff667e876908db776746dc4033c7c not found: ID does not exist" Feb 17 18:08:09 crc kubenswrapper[4892]: I0217 18:08:09.375576 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" path="/var/lib/kubelet/pods/89ba360b-4b7a-4a1b-a417-923d957d1255/volumes" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.514635 4892 scope.go:117] "RemoveContainer" containerID="8a8dca3429143b599989ef7ff2f7114f30539084187e904a5b30eee4fcfc5b7f" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.557623 4892 scope.go:117] "RemoveContainer" containerID="0ed10cd1809fc1ebc8ac2ad88bd47659d051e13889fd973bc2f836802f14044e" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.619785 4892 scope.go:117] "RemoveContainer" containerID="20e78a2656091e6830683ed12bdcfe9347b0af0753251ba18342603e69dcb165" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.668423 4892 scope.go:117] "RemoveContainer" containerID="9977ae08b25b1edefaebcd314933aeb3b589fcc3968062a97ddb2200bd76769c" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.702269 4892 scope.go:117] "RemoveContainer" containerID="be35a06a29cd7c2a1ba928f15928d076ce2f85d560d8048d0cb22851b293f9a7" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.735892 4892 scope.go:117] "RemoveContainer" containerID="c3def8ed7cbb31f1e7a1e19acdef66f4145697e848ca6bd61724ce8cd5374b4b" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.758985 4892 scope.go:117] "RemoveContainer" containerID="d6eca5b3fff973b26b1e0665698334af0d197984bc7aca23a2b70b59cc1daf4d" 
Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.781789 4892 scope.go:117] "RemoveContainer" containerID="79887136152b98a313b969d8b3e992c5f5e5bd87b99e8bdb68693793eb7ca14a" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.799928 4892 scope.go:117] "RemoveContainer" containerID="ceb4c385d00646492bf20fac2ee2ee7bfac8d274b0a4abeaaee8d01b2d0db45c" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.829105 4892 scope.go:117] "RemoveContainer" containerID="6b32704fbc72ff9a7cc5ded27500a276ebad694cfbdf97871511f5b03b78561d" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.851327 4892 scope.go:117] "RemoveContainer" containerID="71ee82ac13e1236283665727695224d81d671980ccde4309b03ac1b224f8754c" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.875579 4892 scope.go:117] "RemoveContainer" containerID="7285fea6e2decb3e8ebd69fc9d2b7024af73b52009fa277ef34f56d1a2aa23c3" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.897943 4892 scope.go:117] "RemoveContainer" containerID="7a8b1188339d86be3d5c064511d914198ccd1ff144a0a5223403ce3b3e9cdf6b" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.918618 4892 scope.go:117] "RemoveContainer" containerID="d1ffbfbf7d7e526d9f7eb72bc01754b35f328feb3b5ac960d6e752e50581986e" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.955824 4892 scope.go:117] "RemoveContainer" containerID="81fff63afe534220bbefc6836acd04547fb898d8b45cbda7b7e4b0b8ac3851af" Feb 17 18:08:56 crc kubenswrapper[4892]: I0217 18:08:56.992774 4892 scope.go:117] "RemoveContainer" containerID="cee32f5ada1e424bb8f30122322fd702190cef4acfca76eec13044ef381a90ff" Feb 17 18:08:57 crc kubenswrapper[4892]: I0217 18:08:57.034714 4892 scope.go:117] "RemoveContainer" containerID="e26651b0dac4631f1d31c476692d4a84f0c1e5516e237a2c33f25e57e3776b8c" Feb 17 18:08:57 crc kubenswrapper[4892]: I0217 18:08:57.064952 4892 scope.go:117] "RemoveContainer" containerID="8a100a0c19ecedf25cf13fb999386c04f0bdb55ef5cf32b452775a4f80064859" Feb 17 18:09:06 crc 
kubenswrapper[4892]: I0217 18:09:06.219659 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6zfq6"] Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220722 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-api" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220740 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-api" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220764 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-reaper" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220771 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-reaper" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220782 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-updater" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220789 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-updater" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220802 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220810 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220924 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-updater" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220931 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-updater" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220955 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220961 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.220977 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.220987 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221006 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="extract-utilities" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221013 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="extract-utilities" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221032 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221038 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221055 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="extract-content" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221062 
4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="extract-content" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221080 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221086 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221107 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221113 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-server" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221123 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-httpd" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221132 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-httpd" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221146 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="registry-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221154 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="registry-server" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221175 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221182 4892 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221189 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221195 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221205 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221211 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-server" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221222 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="rsync" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221228 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="rsync" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221245 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-expirer" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221251 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-expirer" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221271 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server-init" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221277 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server-init" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221291 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221297 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221831 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221842 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-server" Feb 17 18:09:06 crc kubenswrapper[4892]: E0217 18:09:06.221851 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="swift-recon-cron" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.221858 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="swift-recon-cron" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222223 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-httpd" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222249 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222262 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222270 4892 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="rsync" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222282 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222292 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffd802c-eec6-4c8c-a2a9-2571eba7bc33" containerName="neutron-api" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222310 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-updater" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222324 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovs-vswitchd" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222336 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222346 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-reaper" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222360 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="account-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222369 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222382 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-expirer" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222395 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="swift-recon-cron" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222404 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdceb70-55cb-4f2a-ac20-7fe8e9f4d064" containerName="ovsdb-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222418 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-auditor" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222440 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ba360b-4b7a-4a1b-a417-923d957d1255" containerName="registry-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222448 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-server" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222457 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="object-updater" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.222479 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a693208-de83-4f05-b4ff-0b3e7f858c74" containerName="container-replicator" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.224857 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.275749 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6zfq6"] Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.371832 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-utilities\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.371965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-catalog-content\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.372049 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbxn6\" (UniqueName: \"kubernetes.io/projected/9e679793-b837-48a2-a9f1-aa834bdcd400-kube-api-access-tbxn6\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.473516 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-utilities\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.473598 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-catalog-content\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.473648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbxn6\" (UniqueName: \"kubernetes.io/projected/9e679793-b837-48a2-a9f1-aa834bdcd400-kube-api-access-tbxn6\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.474085 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-utilities\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.474162 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-catalog-content\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.492777 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbxn6\" (UniqueName: \"kubernetes.io/projected/9e679793-b837-48a2-a9f1-aa834bdcd400-kube-api-access-tbxn6\") pod \"certified-operators-6zfq6\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:06 crc kubenswrapper[4892]: I0217 18:09:06.555450 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:07 crc kubenswrapper[4892]: I0217 18:09:07.034216 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6zfq6"] Feb 17 18:09:07 crc kubenswrapper[4892]: I0217 18:09:07.424612 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:09:07 crc kubenswrapper[4892]: I0217 18:09:07.424666 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:09:07 crc kubenswrapper[4892]: I0217 18:09:07.581947 4892 generic.go:334] "Generic (PLEG): container finished" podID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerID="ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a" exitCode=0 Feb 17 18:09:07 crc kubenswrapper[4892]: I0217 18:09:07.582013 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerDied","Data":"ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a"} Feb 17 18:09:07 crc kubenswrapper[4892]: I0217 18:09:07.582264 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerStarted","Data":"21886e7a81cc6e517b49755fe2947c0514140567351add43f7b267ca40554189"} Feb 17 18:09:08 crc kubenswrapper[4892]: I0217 18:09:08.598198 4892 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerStarted","Data":"bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238"} Feb 17 18:09:09 crc kubenswrapper[4892]: I0217 18:09:09.609209 4892 generic.go:334] "Generic (PLEG): container finished" podID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerID="bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238" exitCode=0 Feb 17 18:09:09 crc kubenswrapper[4892]: I0217 18:09:09.609453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerDied","Data":"bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238"} Feb 17 18:09:11 crc kubenswrapper[4892]: I0217 18:09:11.634514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerStarted","Data":"e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601"} Feb 17 18:09:11 crc kubenswrapper[4892]: I0217 18:09:11.657529 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6zfq6" podStartSLOduration=2.4715708100000002 podStartE2EDuration="5.657507992s" podCreationTimestamp="2026-02-17 18:09:06 +0000 UTC" firstStartedPulling="2026-02-17 18:09:07.58412423 +0000 UTC m=+1518.959527495" lastFinishedPulling="2026-02-17 18:09:10.770061412 +0000 UTC m=+1522.145464677" observedRunningTime="2026-02-17 18:09:11.653510664 +0000 UTC m=+1523.028913929" watchObservedRunningTime="2026-02-17 18:09:11.657507992 +0000 UTC m=+1523.032911257" Feb 17 18:09:14 crc kubenswrapper[4892]: I0217 18:09:14.740713 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ssql5"] Feb 17 18:09:14 crc kubenswrapper[4892]: I0217 
18:09:14.747468 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:14 crc kubenswrapper[4892]: I0217 18:09:14.760161 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssql5"] Feb 17 18:09:14 crc kubenswrapper[4892]: I0217 18:09:14.926920 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4584\" (UniqueName: \"kubernetes.io/projected/34b8461f-ff1d-40a4-9983-b91bcf32454e-kube-api-access-p4584\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:14 crc kubenswrapper[4892]: I0217 18:09:14.926978 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-catalog-content\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:14 crc kubenswrapper[4892]: I0217 18:09:14.927005 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-utilities\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.029168 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4584\" (UniqueName: \"kubernetes.io/projected/34b8461f-ff1d-40a4-9983-b91bcf32454e-kube-api-access-p4584\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc 
kubenswrapper[4892]: I0217 18:09:15.029246 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-catalog-content\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.029292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-utilities\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.030357 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-utilities\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.030503 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-catalog-content\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.057634 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4584\" (UniqueName: \"kubernetes.io/projected/34b8461f-ff1d-40a4-9983-b91bcf32454e-kube-api-access-p4584\") pod \"community-operators-ssql5\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.070577 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.639664 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssql5"] Feb 17 18:09:15 crc kubenswrapper[4892]: W0217 18:09:15.648509 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b8461f_ff1d_40a4_9983_b91bcf32454e.slice/crio-82bf21e175ab2e29a4a66d298766ed5f2701eeb89896e86584236439b2d839b1 WatchSource:0}: Error finding container 82bf21e175ab2e29a4a66d298766ed5f2701eeb89896e86584236439b2d839b1: Status 404 returned error can't find the container with id 82bf21e175ab2e29a4a66d298766ed5f2701eeb89896e86584236439b2d839b1 Feb 17 18:09:15 crc kubenswrapper[4892]: I0217 18:09:15.710697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerStarted","Data":"82bf21e175ab2e29a4a66d298766ed5f2701eeb89896e86584236439b2d839b1"} Feb 17 18:09:16 crc kubenswrapper[4892]: I0217 18:09:16.556016 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:16 crc kubenswrapper[4892]: I0217 18:09:16.556417 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:16 crc kubenswrapper[4892]: I0217 18:09:16.610487 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:16 crc kubenswrapper[4892]: I0217 18:09:16.719425 4892 generic.go:334] "Generic (PLEG): container finished" podID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerID="c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79" exitCode=0 Feb 17 
18:09:16 crc kubenswrapper[4892]: I0217 18:09:16.719527 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerDied","Data":"c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79"} Feb 17 18:09:16 crc kubenswrapper[4892]: I0217 18:09:16.782905 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:17 crc kubenswrapper[4892]: I0217 18:09:17.733147 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerStarted","Data":"a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38"} Feb 17 18:09:18 crc kubenswrapper[4892]: I0217 18:09:18.739729 4892 generic.go:334] "Generic (PLEG): container finished" podID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerID="a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38" exitCode=0 Feb 17 18:09:18 crc kubenswrapper[4892]: I0217 18:09:18.739784 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerDied","Data":"a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38"} Feb 17 18:09:18 crc kubenswrapper[4892]: I0217 18:09:18.908473 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6zfq6"] Feb 17 18:09:18 crc kubenswrapper[4892]: I0217 18:09:18.908694 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6zfq6" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="registry-server" containerID="cri-o://e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601" gracePeriod=2 Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 
18:09:19.405273 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.517169 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-catalog-content\") pod \"9e679793-b837-48a2-a9f1-aa834bdcd400\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.517236 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbxn6\" (UniqueName: \"kubernetes.io/projected/9e679793-b837-48a2-a9f1-aa834bdcd400-kube-api-access-tbxn6\") pod \"9e679793-b837-48a2-a9f1-aa834bdcd400\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.517391 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-utilities\") pod \"9e679793-b837-48a2-a9f1-aa834bdcd400\" (UID: \"9e679793-b837-48a2-a9f1-aa834bdcd400\") " Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.518383 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-utilities" (OuterVolumeSpecName: "utilities") pod "9e679793-b837-48a2-a9f1-aa834bdcd400" (UID: "9e679793-b837-48a2-a9f1-aa834bdcd400"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.523485 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e679793-b837-48a2-a9f1-aa834bdcd400-kube-api-access-tbxn6" (OuterVolumeSpecName: "kube-api-access-tbxn6") pod "9e679793-b837-48a2-a9f1-aa834bdcd400" (UID: "9e679793-b837-48a2-a9f1-aa834bdcd400"). InnerVolumeSpecName "kube-api-access-tbxn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.583864 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e679793-b837-48a2-a9f1-aa834bdcd400" (UID: "9e679793-b837-48a2-a9f1-aa834bdcd400"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.618869 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.618918 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e679793-b837-48a2-a9f1-aa834bdcd400-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.618935 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbxn6\" (UniqueName: \"kubernetes.io/projected/9e679793-b837-48a2-a9f1-aa834bdcd400-kube-api-access-tbxn6\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.751439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" 
event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerStarted","Data":"95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3"} Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.755522 4892 generic.go:334] "Generic (PLEG): container finished" podID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerID="e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601" exitCode=0 Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.755575 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerDied","Data":"e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601"} Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.755581 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6zfq6" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.755609 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6zfq6" event={"ID":"9e679793-b837-48a2-a9f1-aa834bdcd400","Type":"ContainerDied","Data":"21886e7a81cc6e517b49755fe2947c0514140567351add43f7b267ca40554189"} Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.755631 4892 scope.go:117] "RemoveContainer" containerID="e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.785689 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ssql5" podStartSLOduration=3.253866661 podStartE2EDuration="5.785671366s" podCreationTimestamp="2026-02-17 18:09:14 +0000 UTC" firstStartedPulling="2026-02-17 18:09:16.722592642 +0000 UTC m=+1528.097995907" lastFinishedPulling="2026-02-17 18:09:19.254397337 +0000 UTC m=+1530.629800612" observedRunningTime="2026-02-17 18:09:19.782140251 +0000 UTC m=+1531.157543526" 
watchObservedRunningTime="2026-02-17 18:09:19.785671366 +0000 UTC m=+1531.161074631" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.790379 4892 scope.go:117] "RemoveContainer" containerID="bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.814376 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6zfq6"] Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.823226 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6zfq6"] Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.830519 4892 scope.go:117] "RemoveContainer" containerID="ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.864235 4892 scope.go:117] "RemoveContainer" containerID="e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601" Feb 17 18:09:19 crc kubenswrapper[4892]: E0217 18:09:19.864860 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601\": container with ID starting with e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601 not found: ID does not exist" containerID="e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.864916 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601"} err="failed to get container status \"e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601\": rpc error: code = NotFound desc = could not find container \"e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601\": container with ID starting with e6bcf449e3e3aa3505e7cbcb77edad7f9f00b111e5df1c1b86d3a8f886425601 not 
found: ID does not exist" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.864949 4892 scope.go:117] "RemoveContainer" containerID="bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238" Feb 17 18:09:19 crc kubenswrapper[4892]: E0217 18:09:19.867196 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238\": container with ID starting with bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238 not found: ID does not exist" containerID="bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.867253 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238"} err="failed to get container status \"bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238\": rpc error: code = NotFound desc = could not find container \"bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238\": container with ID starting with bb6dfaedcf6108493a877ee11a7ee5c140e31b819cd5713324fd91585eaa0238 not found: ID does not exist" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.867287 4892 scope.go:117] "RemoveContainer" containerID="ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a" Feb 17 18:09:19 crc kubenswrapper[4892]: E0217 18:09:19.867984 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a\": container with ID starting with ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a not found: ID does not exist" containerID="ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a" Feb 17 18:09:19 crc kubenswrapper[4892]: I0217 18:09:19.868011 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a"} err="failed to get container status \"ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a\": rpc error: code = NotFound desc = could not find container \"ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a\": container with ID starting with ab55c33fcadce3bc547fcb7047a231819e767ceb6272e8d4c927208db3d1d14a not found: ID does not exist" Feb 17 18:09:21 crc kubenswrapper[4892]: I0217 18:09:21.371226 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" path="/var/lib/kubelet/pods/9e679793-b837-48a2-a9f1-aa834bdcd400/volumes" Feb 17 18:09:25 crc kubenswrapper[4892]: I0217 18:09:25.071003 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:25 crc kubenswrapper[4892]: I0217 18:09:25.071604 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:25 crc kubenswrapper[4892]: I0217 18:09:25.122763 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:25 crc kubenswrapper[4892]: I0217 18:09:25.896666 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:25 crc kubenswrapper[4892]: I0217 18:09:25.958439 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssql5"] Feb 17 18:09:27 crc kubenswrapper[4892]: I0217 18:09:27.864232 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ssql5" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="registry-server" 
containerID="cri-o://95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3" gracePeriod=2 Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.510152 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.673599 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-catalog-content\") pod \"34b8461f-ff1d-40a4-9983-b91bcf32454e\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.673694 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4584\" (UniqueName: \"kubernetes.io/projected/34b8461f-ff1d-40a4-9983-b91bcf32454e-kube-api-access-p4584\") pod \"34b8461f-ff1d-40a4-9983-b91bcf32454e\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.673760 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-utilities\") pod \"34b8461f-ff1d-40a4-9983-b91bcf32454e\" (UID: \"34b8461f-ff1d-40a4-9983-b91bcf32454e\") " Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.674636 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-utilities" (OuterVolumeSpecName: "utilities") pod "34b8461f-ff1d-40a4-9983-b91bcf32454e" (UID: "34b8461f-ff1d-40a4-9983-b91bcf32454e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.678908 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b8461f-ff1d-40a4-9983-b91bcf32454e-kube-api-access-p4584" (OuterVolumeSpecName: "kube-api-access-p4584") pod "34b8461f-ff1d-40a4-9983-b91bcf32454e" (UID: "34b8461f-ff1d-40a4-9983-b91bcf32454e"). InnerVolumeSpecName "kube-api-access-p4584". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.735555 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34b8461f-ff1d-40a4-9983-b91bcf32454e" (UID: "34b8461f-ff1d-40a4-9983-b91bcf32454e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.776310 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.776849 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b8461f-ff1d-40a4-9983-b91bcf32454e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.776875 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4584\" (UniqueName: \"kubernetes.io/projected/34b8461f-ff1d-40a4-9983-b91bcf32454e-kube-api-access-p4584\") on node \"crc\" DevicePath \"\"" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.879549 4892 generic.go:334] "Generic (PLEG): container finished" podID="34b8461f-ff1d-40a4-9983-b91bcf32454e" 
containerID="95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3" exitCode=0 Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.879596 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerDied","Data":"95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3"} Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.879623 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssql5" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.879644 4892 scope.go:117] "RemoveContainer" containerID="95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.879631 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssql5" event={"ID":"34b8461f-ff1d-40a4-9983-b91bcf32454e","Type":"ContainerDied","Data":"82bf21e175ab2e29a4a66d298766ed5f2701eeb89896e86584236439b2d839b1"} Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.909324 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssql5"] Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.909340 4892 scope.go:117] "RemoveContainer" containerID="a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.929107 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ssql5"] Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.930688 4892 scope.go:117] "RemoveContainer" containerID="c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.959343 4892 scope.go:117] "RemoveContainer" containerID="95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3" Feb 17 
18:09:28 crc kubenswrapper[4892]: E0217 18:09:28.960074 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3\": container with ID starting with 95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3 not found: ID does not exist" containerID="95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.960117 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3"} err="failed to get container status \"95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3\": rpc error: code = NotFound desc = could not find container \"95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3\": container with ID starting with 95b708a0570637e94b5fd609e81e345c911d2363757d994a61f3521645f285c3 not found: ID does not exist" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.960143 4892 scope.go:117] "RemoveContainer" containerID="a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38" Feb 17 18:09:28 crc kubenswrapper[4892]: E0217 18:09:28.960554 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38\": container with ID starting with a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38 not found: ID does not exist" containerID="a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.960602 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38"} err="failed to get container status 
\"a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38\": rpc error: code = NotFound desc = could not find container \"a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38\": container with ID starting with a9e777f49e285a8cd9bb871ff4f1c52b00ee140b795942681d98172742504d38 not found: ID does not exist" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.960637 4892 scope.go:117] "RemoveContainer" containerID="c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79" Feb 17 18:09:28 crc kubenswrapper[4892]: E0217 18:09:28.960962 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79\": container with ID starting with c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79 not found: ID does not exist" containerID="c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79" Feb 17 18:09:28 crc kubenswrapper[4892]: I0217 18:09:28.960986 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79"} err="failed to get container status \"c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79\": rpc error: code = NotFound desc = could not find container \"c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79\": container with ID starting with c3c5437837d58bcbe392780125c0cd91dc625edc3be0661d266b890cc5dc5f79 not found: ID does not exist" Feb 17 18:09:29 crc kubenswrapper[4892]: I0217 18:09:29.373354 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" path="/var/lib/kubelet/pods/34b8461f-ff1d-40a4-9983-b91bcf32454e/volumes" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.685153 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fl5dk"] Feb 17 18:09:35 
crc kubenswrapper[4892]: E0217 18:09:35.686171 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="extract-content" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686189 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="extract-content" Feb 17 18:09:35 crc kubenswrapper[4892]: E0217 18:09:35.686204 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="extract-utilities" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686213 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="extract-utilities" Feb 17 18:09:35 crc kubenswrapper[4892]: E0217 18:09:35.686233 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="extract-content" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686243 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="extract-content" Feb 17 18:09:35 crc kubenswrapper[4892]: E0217 18:09:35.686256 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="registry-server" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686263 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="registry-server" Feb 17 18:09:35 crc kubenswrapper[4892]: E0217 18:09:35.686276 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="extract-utilities" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686283 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="extract-utilities" Feb 17 18:09:35 crc 
kubenswrapper[4892]: E0217 18:09:35.686302 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="registry-server" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686310 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="registry-server" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686514 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e679793-b837-48a2-a9f1-aa834bdcd400" containerName="registry-server" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.686546 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b8461f-ff1d-40a4-9983-b91bcf32454e" containerName="registry-server" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.688049 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.704346 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl5dk"] Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.782092 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfns4\" (UniqueName: \"kubernetes.io/projected/843dddda-c2c3-440c-b416-1aa41fc555d7-kube-api-access-cfns4\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.782959 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-utilities\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc 
kubenswrapper[4892]: I0217 18:09:35.783093 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-catalog-content\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.884108 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-utilities\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.884198 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-catalog-content\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.884268 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfns4\" (UniqueName: \"kubernetes.io/projected/843dddda-c2c3-440c-b416-1aa41fc555d7-kube-api-access-cfns4\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.884785 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-utilities\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk" Feb 17 18:09:35 crc kubenswrapper[4892]: 
I0217 18:09:35.884843 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-catalog-content\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:35 crc kubenswrapper[4892]: I0217 18:09:35.915561 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfns4\" (UniqueName: \"kubernetes.io/projected/843dddda-c2c3-440c-b416-1aa41fc555d7-kube-api-access-cfns4\") pod \"redhat-marketplace-fl5dk\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") " pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:36 crc kubenswrapper[4892]: I0217 18:09:36.009554 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:36 crc kubenswrapper[4892]: I0217 18:09:36.462943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl5dk"]
Feb 17 18:09:36 crc kubenswrapper[4892]: W0217 18:09:36.474296 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod843dddda_c2c3_440c_b416_1aa41fc555d7.slice/crio-bbce2846c1881e4775e4fc2842fdaa05c1aa130922f44ea69b6008e06ba631a4 WatchSource:0}: Error finding container bbce2846c1881e4775e4fc2842fdaa05c1aa130922f44ea69b6008e06ba631a4: Status 404 returned error can't find the container with id bbce2846c1881e4775e4fc2842fdaa05c1aa130922f44ea69b6008e06ba631a4
Feb 17 18:09:36 crc kubenswrapper[4892]: I0217 18:09:36.970080 4892 generic.go:334] "Generic (PLEG): container finished" podID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerID="1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582" exitCode=0
Feb 17 18:09:36 crc kubenswrapper[4892]: I0217 18:09:36.970135 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl5dk" event={"ID":"843dddda-c2c3-440c-b416-1aa41fc555d7","Type":"ContainerDied","Data":"1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582"}
Feb 17 18:09:36 crc kubenswrapper[4892]: I0217 18:09:36.970166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl5dk" event={"ID":"843dddda-c2c3-440c-b416-1aa41fc555d7","Type":"ContainerStarted","Data":"bbce2846c1881e4775e4fc2842fdaa05c1aa130922f44ea69b6008e06ba631a4"}
Feb 17 18:09:37 crc kubenswrapper[4892]: I0217 18:09:37.425236 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 18:09:37 crc kubenswrapper[4892]: I0217 18:09:37.425721 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 18:09:38 crc kubenswrapper[4892]: I0217 18:09:38.985599 4892 generic.go:334] "Generic (PLEG): container finished" podID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerID="a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2" exitCode=0
Feb 17 18:09:38 crc kubenswrapper[4892]: I0217 18:09:38.985714 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl5dk" event={"ID":"843dddda-c2c3-440c-b416-1aa41fc555d7","Type":"ContainerDied","Data":"a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2"}
Feb 17 18:09:39 crc kubenswrapper[4892]: I0217 18:09:39.997587 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl5dk" event={"ID":"843dddda-c2c3-440c-b416-1aa41fc555d7","Type":"ContainerStarted","Data":"2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11"}
Feb 17 18:09:40 crc kubenswrapper[4892]: I0217 18:09:40.021763 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fl5dk" podStartSLOduration=2.600893222 podStartE2EDuration="5.02174357s" podCreationTimestamp="2026-02-17 18:09:35 +0000 UTC" firstStartedPulling="2026-02-17 18:09:36.973297901 +0000 UTC m=+1548.348701166" lastFinishedPulling="2026-02-17 18:09:39.394148239 +0000 UTC m=+1550.769551514" observedRunningTime="2026-02-17 18:09:40.014543295 +0000 UTC m=+1551.389946560" watchObservedRunningTime="2026-02-17 18:09:40.02174357 +0000 UTC m=+1551.397146845"
Feb 17 18:09:46 crc kubenswrapper[4892]: I0217 18:09:46.009914 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:46 crc kubenswrapper[4892]: I0217 18:09:46.010560 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:46 crc kubenswrapper[4892]: I0217 18:09:46.095588 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:46 crc kubenswrapper[4892]: I0217 18:09:46.168294 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:46 crc kubenswrapper[4892]: I0217 18:09:46.352187 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl5dk"]
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.070022 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fl5dk" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="registry-server" containerID="cri-o://2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11" gracePeriod=2
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.479623 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.592215 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-catalog-content\") pod \"843dddda-c2c3-440c-b416-1aa41fc555d7\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") "
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.592373 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-utilities\") pod \"843dddda-c2c3-440c-b416-1aa41fc555d7\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") "
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.592408 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfns4\" (UniqueName: \"kubernetes.io/projected/843dddda-c2c3-440c-b416-1aa41fc555d7-kube-api-access-cfns4\") pod \"843dddda-c2c3-440c-b416-1aa41fc555d7\" (UID: \"843dddda-c2c3-440c-b416-1aa41fc555d7\") "
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.593887 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-utilities" (OuterVolumeSpecName: "utilities") pod "843dddda-c2c3-440c-b416-1aa41fc555d7" (UID: "843dddda-c2c3-440c-b416-1aa41fc555d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.604173 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843dddda-c2c3-440c-b416-1aa41fc555d7-kube-api-access-cfns4" (OuterVolumeSpecName: "kube-api-access-cfns4") pod "843dddda-c2c3-440c-b416-1aa41fc555d7" (UID: "843dddda-c2c3-440c-b416-1aa41fc555d7"). InnerVolumeSpecName "kube-api-access-cfns4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.694317 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.694368 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfns4\" (UniqueName: \"kubernetes.io/projected/843dddda-c2c3-440c-b416-1aa41fc555d7-kube-api-access-cfns4\") on node \"crc\" DevicePath \"\""
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.739222 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "843dddda-c2c3-440c-b416-1aa41fc555d7" (UID: "843dddda-c2c3-440c-b416-1aa41fc555d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:09:48 crc kubenswrapper[4892]: I0217 18:09:48.796112 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843dddda-c2c3-440c-b416-1aa41fc555d7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.081330 4892 generic.go:334] "Generic (PLEG): container finished" podID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerID="2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11" exitCode=0
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.081399 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl5dk"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.081403 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl5dk" event={"ID":"843dddda-c2c3-440c-b416-1aa41fc555d7","Type":"ContainerDied","Data":"2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11"}
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.081529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl5dk" event={"ID":"843dddda-c2c3-440c-b416-1aa41fc555d7","Type":"ContainerDied","Data":"bbce2846c1881e4775e4fc2842fdaa05c1aa130922f44ea69b6008e06ba631a4"}
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.081549 4892 scope.go:117] "RemoveContainer" containerID="2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.108517 4892 scope.go:117] "RemoveContainer" containerID="a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.119154 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl5dk"]
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.130964 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl5dk"]
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.133348 4892 scope.go:117] "RemoveContainer" containerID="1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.154858 4892 scope.go:117] "RemoveContainer" containerID="2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11"
Feb 17 18:09:49 crc kubenswrapper[4892]: E0217 18:09:49.155344 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11\": container with ID starting with 2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11 not found: ID does not exist" containerID="2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.155410 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11"} err="failed to get container status \"2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11\": rpc error: code = NotFound desc = could not find container \"2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11\": container with ID starting with 2a440f2ca45a96f082598cd658fab8a8270d7696c61b4d072dad39442e027b11 not found: ID does not exist"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.155448 4892 scope.go:117] "RemoveContainer" containerID="a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2"
Feb 17 18:09:49 crc kubenswrapper[4892]: E0217 18:09:49.155831 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2\": container with ID starting with a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2 not found: ID does not exist" containerID="a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.155867 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2"} err="failed to get container status \"a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2\": rpc error: code = NotFound desc = could not find container \"a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2\": container with ID starting with a0813be761d9c0f784eb1e358ff94c712d66f2c36f3d70657a82ae486d3c48d2 not found: ID does not exist"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.155893 4892 scope.go:117] "RemoveContainer" containerID="1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582"
Feb 17 18:09:49 crc kubenswrapper[4892]: E0217 18:09:49.156222 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582\": container with ID starting with 1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582 not found: ID does not exist" containerID="1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.156291 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582"} err="failed to get container status \"1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582\": rpc error: code = NotFound desc = could not find container \"1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582\": container with ID starting with 1ab0a24fe8b6a9b4a9aec014865e9c74b0da3562e11d554e6298864c0e50a582 not found: ID does not exist"
Feb 17 18:09:49 crc kubenswrapper[4892]: I0217 18:09:49.375283 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" path="/var/lib/kubelet/pods/843dddda-c2c3-440c-b416-1aa41fc555d7/volumes"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.498795 4892 scope.go:117] "RemoveContainer" containerID="c4a196bf2d492c309554895afd4605e63dc018017cfb378e06d3dc7157f1d076"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.521614 4892 scope.go:117] "RemoveContainer" containerID="ab6e9785f15283af7e10144fb1b893d0defb9dedbde94a7d3c471d856a81f83b"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.560100 4892 scope.go:117] "RemoveContainer" containerID="b921cc6366edc36ee38b6a143bfc88dcd8710d62a31bab58b5973626b24cb7ca"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.583071 4892 scope.go:117] "RemoveContainer" containerID="3a5a394db3114515f27acb7b3724e0a08543612a8c1883d88a2dbc0f06751727"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.618466 4892 scope.go:117] "RemoveContainer" containerID="a025ad92e083691a77b535757d3d225e57599fa5fb5a9fd39fba4ecee4733b2b"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.655528 4892 scope.go:117] "RemoveContainer" containerID="47ba413927d309d951e32487d20bc96e08504a3f1ddb3fad89ea5e5a59e190b4"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.685119 4892 scope.go:117] "RemoveContainer" containerID="13f881e4638fa1d108f0b4917c9d4bd23859a377ed76cbafeeead4d196d41cab"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.733362 4892 scope.go:117] "RemoveContainer" containerID="f4887c3314b9e14308f24778bd2b0e62ab2da2fd653b968ca3642d6372b9cb2e"
Feb 17 18:09:57 crc kubenswrapper[4892]: I0217 18:09:57.754316 4892 scope.go:117] "RemoveContainer" containerID="13772a7bab2e085826df0815a87fd8bfefe58b5d98ecd96e8581b82a84d62443"
Feb 17 18:10:07 crc kubenswrapper[4892]: I0217 18:10:07.427365 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 18:10:07 crc kubenswrapper[4892]: I0217 18:10:07.427973 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 18:10:07 crc kubenswrapper[4892]: I0217 18:10:07.428015 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt"
Feb 17 18:10:07 crc kubenswrapper[4892]: I0217 18:10:07.428792 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 18:10:07 crc kubenswrapper[4892]: I0217 18:10:07.428887 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" gracePeriod=600
Feb 17 18:10:07 crc kubenswrapper[4892]: E0217 18:10:07.570256 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:10:08 crc kubenswrapper[4892]: I0217 18:10:08.265644 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" exitCode=0
Feb 17 18:10:08 crc kubenswrapper[4892]: I0217 18:10:08.265696 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"}
Feb 17 18:10:08 crc kubenswrapper[4892]: I0217 18:10:08.265747 4892 scope.go:117] "RemoveContainer" containerID="0b5f9c9cd974cd254191642f6cc48f52aa43de6e7dce9d7b7d9b694f86f42344"
Feb 17 18:10:08 crc kubenswrapper[4892]: I0217 18:10:08.266407 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:10:08 crc kubenswrapper[4892]: E0217 18:10:08.266702 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:10:20 crc kubenswrapper[4892]: I0217 18:10:20.359345 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:10:20 crc kubenswrapper[4892]: E0217 18:10:20.360080 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:10:33 crc kubenswrapper[4892]: I0217 18:10:33.360251 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:10:33 crc kubenswrapper[4892]: E0217 18:10:33.361172 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:10:45 crc kubenswrapper[4892]: I0217 18:10:45.359803 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:10:45 crc kubenswrapper[4892]: E0217 18:10:45.360487 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:10:57 crc kubenswrapper[4892]: I0217 18:10:57.359886 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:10:57 crc kubenswrapper[4892]: E0217 18:10:57.361140 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:10:57 crc kubenswrapper[4892]: I0217 18:10:57.962360 4892 scope.go:117] "RemoveContainer" containerID="276fbe4a642e629846be447b31c24d7070dfa435158a65fb8bc262ffc1b036a1"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.004697 4892 scope.go:117] "RemoveContainer" containerID="f199e1de12a08bfc19910317b33654d3544eab27381b1a50fd11e7d82eeae7da"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.027755 4892 scope.go:117] "RemoveContainer" containerID="bbca05ba8000e1544dd1256fe1d48355fe4077385199194480c09fd31d0d03ad"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.053446 4892 scope.go:117] "RemoveContainer" containerID="591432f5c35dbe4c788cd2a9a9485c33f60acc25bb3e704d33d8efd0d5c77925"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.084992 4892 scope.go:117] "RemoveContainer" containerID="09c15a6e47f0636a2694c22ae5f7d75544f34568564c4577e9dcfe58dd2d7927"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.125244 4892 scope.go:117] "RemoveContainer" containerID="72c2cbaf2de54480ce7abb484d8c16ea293b67cb726839d9a2f462baee040be3"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.142092 4892 scope.go:117] "RemoveContainer" containerID="80a0a6f3f3f9ebbe69c194f3e074f48a042c65dd9cb6a3e8b0adc00867e049f1"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.162980 4892 scope.go:117] "RemoveContainer" containerID="d6866eacf02bafe4ccea20fee02f0034864b3623f04f665f9716feff8b87a96f"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.183958 4892 scope.go:117] "RemoveContainer" containerID="c99d3760836f82bf94729ec9f9c217c2b3cda31831ae4231d401c839708c201c"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.220138 4892 scope.go:117] "RemoveContainer" containerID="85e1b561483fee3321875f44825c87ea1fd5510243e332469de04d2d479fd0ae"
Feb 17 18:10:58 crc kubenswrapper[4892]: I0217 18:10:58.242687 4892 scope.go:117] "RemoveContainer" containerID="55487b33f7e954c0f5c383758f8a9c00a4376ebe648b34305608a954745f431e"
Feb 17 18:11:10 crc kubenswrapper[4892]: I0217 18:11:10.359663 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:11:10 crc kubenswrapper[4892]: E0217 18:11:10.362317 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:11:22 crc kubenswrapper[4892]: I0217 18:11:22.360434 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:11:22 crc kubenswrapper[4892]: E0217 18:11:22.361516 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:11:33 crc kubenswrapper[4892]: I0217 18:11:33.359703 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:11:33 crc kubenswrapper[4892]: E0217 18:11:33.360722 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:11:47 crc kubenswrapper[4892]: I0217 18:11:47.359862 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:11:47 crc kubenswrapper[4892]: E0217 18:11:47.360787 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.360611 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:11:58 crc kubenswrapper[4892]: E0217 18:11:58.361743 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.418989 4892 scope.go:117] "RemoveContainer" containerID="ab514791aa617c049db64bf7cfd3380e2e19c1cb78b1e69eb13b878f9cf05a05"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.450500 4892 scope.go:117] "RemoveContainer" containerID="1ff9989f4376fc857006e940d7640f139edcdfb827dfb96c134138f77b244736"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.470152 4892 scope.go:117] "RemoveContainer" containerID="6a15516a5b658d1698d3bd63059e7269ca1018b30edeef2d1a8743a7ac2d9849"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.515543 4892 scope.go:117] "RemoveContainer" containerID="df8c62daa6f190fa1487fe86dda7f0982aee1c48c29f17d050a8676d2e3c93d6"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.558282 4892 scope.go:117] "RemoveContainer" containerID="774bd1818f399c46a92e0d40d59f996bb13f1cbbcb52122a6f55526ac6091b99"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.589164 4892 scope.go:117] "RemoveContainer" containerID="a11e7aae423c70ef1e6de32b4ff56f1385410dd2b7fc9f3e8ba4ae35f96458cc"
Feb 17 18:11:58 crc kubenswrapper[4892]: I0217 18:11:58.633633 4892 scope.go:117] "RemoveContainer" containerID="0f890fee5c00a218e42e924b96222201e9ce426ef9dad178a8610ce733d3612e"
Feb 17 18:12:12 crc kubenswrapper[4892]: I0217 18:12:12.360360 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:12:12 crc kubenswrapper[4892]: E0217 18:12:12.361193 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:12:23 crc kubenswrapper[4892]: I0217 18:12:23.359847 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:12:23 crc kubenswrapper[4892]: E0217 18:12:23.360657 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:12:36 crc kubenswrapper[4892]: I0217 18:12:36.360900 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:12:36 crc kubenswrapper[4892]: E0217 18:12:36.361907 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:12:51 crc kubenswrapper[4892]: I0217 18:12:51.360768 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:12:51 crc kubenswrapper[4892]: E0217 18:12:51.362044 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:12:58 crc kubenswrapper[4892]: I0217 18:12:58.746022 4892 scope.go:117] "RemoveContainer" containerID="7576bc6a46f917916ace668163f910721525c50fdb2445e86a823be4d67ae777"
Feb 17 18:12:58 crc kubenswrapper[4892]: I0217 18:12:58.782227 4892 scope.go:117] "RemoveContainer" containerID="f69a74c1d7ff5f5e7cab3dfb70cfb2a1f2475fdc6a082b95bd0059bf7596b0bc"
Feb 17 18:12:58 crc kubenswrapper[4892]: I0217 18:12:58.801755 4892 scope.go:117] "RemoveContainer" containerID="4e88a2b9e15b3a6cb7fc5f88c55641255e9e97f6b726a8c1f4742146a59a82fc"
Feb 17 18:12:58 crc kubenswrapper[4892]: I0217 18:12:58.830928 4892 scope.go:117] "RemoveContainer" containerID="a13a8f8cc49ae7cb4143bc67cb64f91acf0fe318d17893c57c649069e80a7423"
Feb 17 18:12:58 crc kubenswrapper[4892]: I0217 18:12:58.887605 4892 scope.go:117] "RemoveContainer" containerID="9bac58ad39e6590d4042399c64a90295868b511562a801762900f675847c27a2"
Feb 17 18:12:58 crc kubenswrapper[4892]: I0217 18:12:58.910059 4892 scope.go:117] "RemoveContainer" containerID="f7ec71d9b448df0c427bf1c569f629e2d1b04b498fb150e292de366d671c7e43"
Feb 17 18:13:04 crc kubenswrapper[4892]: I0217 18:13:04.359373 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:13:04 crc kubenswrapper[4892]: E0217 18:13:04.360059 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:13:17 crc kubenswrapper[4892]: I0217 18:13:17.360449 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:13:17 crc kubenswrapper[4892]: E0217 18:13:17.361557 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:13:29 crc kubenswrapper[4892]: I0217 18:13:29.365827 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:13:29 crc kubenswrapper[4892]: E0217 18:13:29.366778 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:13:40 crc kubenswrapper[4892]: I0217 18:13:40.359780 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:13:40 crc kubenswrapper[4892]: E0217 18:13:40.361071 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:13:52 crc kubenswrapper[4892]: I0217 18:13:52.360330 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:13:52 crc kubenswrapper[4892]: E0217 18:13:52.361386 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:14:07 crc kubenswrapper[4892]: I0217 18:14:07.361465 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:14:07 crc kubenswrapper[4892]: E0217 18:14:07.362507 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:14:18 crc kubenswrapper[4892]: I0217 18:14:18.359495 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:14:18 crc kubenswrapper[4892]: E0217 18:14:18.360176 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:14:29 crc kubenswrapper[4892]: I0217 18:14:29.366837 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431"
Feb 17 18:14:29 crc kubenswrapper[4892]: E0217 18:14:29.367786 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s
restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:14:42 crc kubenswrapper[4892]: I0217 18:14:42.359723 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" Feb 17 18:14:42 crc kubenswrapper[4892]: E0217 18:14:42.360470 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:14:53 crc kubenswrapper[4892]: I0217 18:14:53.359894 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" Feb 17 18:14:53 crc kubenswrapper[4892]: E0217 18:14:53.360733 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.149030 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz"] Feb 17 18:15:00 crc kubenswrapper[4892]: E0217 18:15:00.150176 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" 
containerName="extract-utilities" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.150206 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="extract-utilities" Feb 17 18:15:00 crc kubenswrapper[4892]: E0217 18:15:00.150235 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="registry-server" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.150252 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="registry-server" Feb 17 18:15:00 crc kubenswrapper[4892]: E0217 18:15:00.150352 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="extract-content" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.150365 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="extract-content" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.150718 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="843dddda-c2c3-440c-b416-1aa41fc555d7" containerName="registry-server" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.151444 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.155337 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.158348 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.161402 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz"] Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.247840 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34695ec1-983c-49e1-b645-82a0e41b0b35-config-volume\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.247949 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfng\" (UniqueName: \"kubernetes.io/projected/34695ec1-983c-49e1-b645-82a0e41b0b35-kube-api-access-whfng\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.248122 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34695ec1-983c-49e1-b645-82a0e41b0b35-secret-volume\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.349230 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34695ec1-983c-49e1-b645-82a0e41b0b35-config-volume\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.349317 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfng\" (UniqueName: \"kubernetes.io/projected/34695ec1-983c-49e1-b645-82a0e41b0b35-kube-api-access-whfng\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.349385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34695ec1-983c-49e1-b645-82a0e41b0b35-secret-volume\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.350446 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34695ec1-983c-49e1-b645-82a0e41b0b35-config-volume\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.356617 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/34695ec1-983c-49e1-b645-82a0e41b0b35-secret-volume\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.375944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfng\" (UniqueName: \"kubernetes.io/projected/34695ec1-983c-49e1-b645-82a0e41b0b35-kube-api-access-whfng\") pod \"collect-profiles-29522535-k86zz\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.473366 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:00 crc kubenswrapper[4892]: I0217 18:15:00.904934 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz"] Feb 17 18:15:00 crc kubenswrapper[4892]: W0217 18:15:00.911153 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34695ec1_983c_49e1_b645_82a0e41b0b35.slice/crio-4a5db1959c968b8d6204311799efac8d571fa183c383a5ff0e989f3a1a13f3a7 WatchSource:0}: Error finding container 4a5db1959c968b8d6204311799efac8d571fa183c383a5ff0e989f3a1a13f3a7: Status 404 returned error can't find the container with id 4a5db1959c968b8d6204311799efac8d571fa183c383a5ff0e989f3a1a13f3a7 Feb 17 18:15:01 crc kubenswrapper[4892]: I0217 18:15:01.228932 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" event={"ID":"34695ec1-983c-49e1-b645-82a0e41b0b35","Type":"ContainerStarted","Data":"65bbdf31aaeebc94fb95fd4fd370b1dc72876f8c856d3bf3966e599e1f0db999"} Feb 17 18:15:01 crc 
kubenswrapper[4892]: I0217 18:15:01.228985 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" event={"ID":"34695ec1-983c-49e1-b645-82a0e41b0b35","Type":"ContainerStarted","Data":"4a5db1959c968b8d6204311799efac8d571fa183c383a5ff0e989f3a1a13f3a7"} Feb 17 18:15:01 crc kubenswrapper[4892]: I0217 18:15:01.252174 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" podStartSLOduration=1.2521521660000001 podStartE2EDuration="1.252152166s" podCreationTimestamp="2026-02-17 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:15:01.24482374 +0000 UTC m=+1872.620227005" watchObservedRunningTime="2026-02-17 18:15:01.252152166 +0000 UTC m=+1872.627555451" Feb 17 18:15:02 crc kubenswrapper[4892]: I0217 18:15:02.239994 4892 generic.go:334] "Generic (PLEG): container finished" podID="34695ec1-983c-49e1-b645-82a0e41b0b35" containerID="65bbdf31aaeebc94fb95fd4fd370b1dc72876f8c856d3bf3966e599e1f0db999" exitCode=0 Feb 17 18:15:02 crc kubenswrapper[4892]: I0217 18:15:02.240166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" event={"ID":"34695ec1-983c-49e1-b645-82a0e41b0b35","Type":"ContainerDied","Data":"65bbdf31aaeebc94fb95fd4fd370b1dc72876f8c856d3bf3966e599e1f0db999"} Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.514302 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.601054 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whfng\" (UniqueName: \"kubernetes.io/projected/34695ec1-983c-49e1-b645-82a0e41b0b35-kube-api-access-whfng\") pod \"34695ec1-983c-49e1-b645-82a0e41b0b35\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.601100 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34695ec1-983c-49e1-b645-82a0e41b0b35-config-volume\") pod \"34695ec1-983c-49e1-b645-82a0e41b0b35\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.601242 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34695ec1-983c-49e1-b645-82a0e41b0b35-secret-volume\") pod \"34695ec1-983c-49e1-b645-82a0e41b0b35\" (UID: \"34695ec1-983c-49e1-b645-82a0e41b0b35\") " Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.602696 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34695ec1-983c-49e1-b645-82a0e41b0b35-config-volume" (OuterVolumeSpecName: "config-volume") pod "34695ec1-983c-49e1-b645-82a0e41b0b35" (UID: "34695ec1-983c-49e1-b645-82a0e41b0b35"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.606763 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34695ec1-983c-49e1-b645-82a0e41b0b35-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "34695ec1-983c-49e1-b645-82a0e41b0b35" (UID: "34695ec1-983c-49e1-b645-82a0e41b0b35"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.606761 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34695ec1-983c-49e1-b645-82a0e41b0b35-kube-api-access-whfng" (OuterVolumeSpecName: "kube-api-access-whfng") pod "34695ec1-983c-49e1-b645-82a0e41b0b35" (UID: "34695ec1-983c-49e1-b645-82a0e41b0b35"). InnerVolumeSpecName "kube-api-access-whfng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.703502 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34695ec1-983c-49e1-b645-82a0e41b0b35-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.703753 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whfng\" (UniqueName: \"kubernetes.io/projected/34695ec1-983c-49e1-b645-82a0e41b0b35-kube-api-access-whfng\") on node \"crc\" DevicePath \"\"" Feb 17 18:15:03 crc kubenswrapper[4892]: I0217 18:15:03.703768 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34695ec1-983c-49e1-b645-82a0e41b0b35-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:15:04 crc kubenswrapper[4892]: I0217 18:15:04.262628 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" Feb 17 18:15:04 crc kubenswrapper[4892]: I0217 18:15:04.262591 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz" event={"ID":"34695ec1-983c-49e1-b645-82a0e41b0b35","Type":"ContainerDied","Data":"4a5db1959c968b8d6204311799efac8d571fa183c383a5ff0e989f3a1a13f3a7"} Feb 17 18:15:04 crc kubenswrapper[4892]: I0217 18:15:04.262719 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a5db1959c968b8d6204311799efac8d571fa183c383a5ff0e989f3a1a13f3a7" Feb 17 18:15:07 crc kubenswrapper[4892]: I0217 18:15:07.360212 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" Feb 17 18:15:07 crc kubenswrapper[4892]: E0217 18:15:07.360780 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:15:18 crc kubenswrapper[4892]: I0217 18:15:18.359861 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" Feb 17 18:15:19 crc kubenswrapper[4892]: I0217 18:15:19.409188 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"17bcf2f3d5f53df9b0508cf062145c481335de2a6af1329fbd270946f1b3c622"} Feb 17 18:17:37 crc kubenswrapper[4892]: I0217 18:17:37.425049 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:17:37 crc kubenswrapper[4892]: I0217 18:17:37.425652 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:18:07 crc kubenswrapper[4892]: I0217 18:18:07.424421 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:18:07 crc kubenswrapper[4892]: I0217 18:18:07.425011 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:18:37 crc kubenswrapper[4892]: I0217 18:18:37.424401 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:18:37 crc kubenswrapper[4892]: I0217 18:18:37.425068 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:18:37 crc kubenswrapper[4892]: I0217 18:18:37.425125 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:18:37 crc kubenswrapper[4892]: I0217 18:18:37.425776 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17bcf2f3d5f53df9b0508cf062145c481335de2a6af1329fbd270946f1b3c622"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:18:37 crc kubenswrapper[4892]: I0217 18:18:37.425844 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://17bcf2f3d5f53df9b0508cf062145c481335de2a6af1329fbd270946f1b3c622" gracePeriod=600 Feb 17 18:18:38 crc kubenswrapper[4892]: I0217 18:18:38.185509 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="17bcf2f3d5f53df9b0508cf062145c481335de2a6af1329fbd270946f1b3c622" exitCode=0 Feb 17 18:18:38 crc kubenswrapper[4892]: I0217 18:18:38.185577 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"17bcf2f3d5f53df9b0508cf062145c481335de2a6af1329fbd270946f1b3c622"} Feb 17 18:18:38 crc kubenswrapper[4892]: I0217 18:18:38.186148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736"} Feb 17 18:18:38 crc kubenswrapper[4892]: I0217 18:18:38.186194 4892 scope.go:117] "RemoveContainer" containerID="6beef292233363e1e0a1fae70e7b7bf66e739bf2ae8344c3d6c91fca25fc9431" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.915587 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-twxcd"] Feb 17 18:19:11 crc kubenswrapper[4892]: E0217 18:19:11.916491 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34695ec1-983c-49e1-b645-82a0e41b0b35" containerName="collect-profiles" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.916504 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="34695ec1-983c-49e1-b645-82a0e41b0b35" containerName="collect-profiles" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.916713 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="34695ec1-983c-49e1-b645-82a0e41b0b35" containerName="collect-profiles" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.918022 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.938373 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-twxcd"] Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.995898 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-catalog-content\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.996223 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g6mh\" (UniqueName: \"kubernetes.io/projected/12aa0ded-8535-41a8-b852-e4164a3158b2-kube-api-access-7g6mh\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:11 crc kubenswrapper[4892]: I0217 18:19:11.996671 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-utilities\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.098561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g6mh\" (UniqueName: \"kubernetes.io/projected/12aa0ded-8535-41a8-b852-e4164a3158b2-kube-api-access-7g6mh\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.098961 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-utilities\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.099175 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-catalog-content\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.099774 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-utilities\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.099774 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-catalog-content\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.126742 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g6mh\" (UniqueName: \"kubernetes.io/projected/12aa0ded-8535-41a8-b852-e4164a3158b2-kube-api-access-7g6mh\") pod \"redhat-operators-twxcd\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.251326 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:12 crc kubenswrapper[4892]: I0217 18:19:12.713851 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-twxcd"] Feb 17 18:19:13 crc kubenswrapper[4892]: I0217 18:19:13.479305 4892 generic.go:334] "Generic (PLEG): container finished" podID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerID="5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d" exitCode=0 Feb 17 18:19:13 crc kubenswrapper[4892]: I0217 18:19:13.479353 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerDied","Data":"5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d"} Feb 17 18:19:13 crc kubenswrapper[4892]: I0217 18:19:13.479601 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerStarted","Data":"c6f250cf417cb312eb2f578873ed55638ca6d7ca9758e5232ff1dc333deed6b8"} Feb 17 18:19:13 crc kubenswrapper[4892]: I0217 18:19:13.480922 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:19:15 crc kubenswrapper[4892]: I0217 18:19:15.496082 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerStarted","Data":"533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37"} Feb 17 18:19:16 crc kubenswrapper[4892]: I0217 18:19:16.509204 4892 generic.go:334] "Generic (PLEG): container finished" podID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerID="533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37" exitCode=0 Feb 17 18:19:16 crc kubenswrapper[4892]: I0217 18:19:16.509252 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerDied","Data":"533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37"} Feb 17 18:19:18 crc kubenswrapper[4892]: I0217 18:19:18.566960 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerStarted","Data":"3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da"} Feb 17 18:19:19 crc kubenswrapper[4892]: I0217 18:19:19.621205 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-twxcd" podStartSLOduration=4.071907243 podStartE2EDuration="8.621180091s" podCreationTimestamp="2026-02-17 18:19:11 +0000 UTC" firstStartedPulling="2026-02-17 18:19:13.480640742 +0000 UTC m=+2124.856044007" lastFinishedPulling="2026-02-17 18:19:18.02991359 +0000 UTC m=+2129.405316855" observedRunningTime="2026-02-17 18:19:19.609072895 +0000 UTC m=+2130.984476170" watchObservedRunningTime="2026-02-17 18:19:19.621180091 +0000 UTC m=+2130.996583366" Feb 17 18:19:22 crc kubenswrapper[4892]: I0217 18:19:22.252355 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:22 crc kubenswrapper[4892]: I0217 18:19:22.253273 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:23 crc kubenswrapper[4892]: I0217 18:19:23.327646 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-twxcd" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="registry-server" probeResult="failure" output=< Feb 17 18:19:23 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 18:19:23 crc kubenswrapper[4892]: > Feb 17 18:19:32 crc kubenswrapper[4892]: I0217 
18:19:32.294567 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:32 crc kubenswrapper[4892]: I0217 18:19:32.342582 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:32 crc kubenswrapper[4892]: I0217 18:19:32.537724 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-twxcd"] Feb 17 18:19:33 crc kubenswrapper[4892]: I0217 18:19:33.720539 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-twxcd" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="registry-server" containerID="cri-o://3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da" gracePeriod=2 Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.183665 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.325499 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-utilities\") pod \"12aa0ded-8535-41a8-b852-e4164a3158b2\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.325596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-catalog-content\") pod \"12aa0ded-8535-41a8-b852-e4164a3158b2\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.325616 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g6mh\" (UniqueName: 
\"kubernetes.io/projected/12aa0ded-8535-41a8-b852-e4164a3158b2-kube-api-access-7g6mh\") pod \"12aa0ded-8535-41a8-b852-e4164a3158b2\" (UID: \"12aa0ded-8535-41a8-b852-e4164a3158b2\") " Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.326496 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-utilities" (OuterVolumeSpecName: "utilities") pod "12aa0ded-8535-41a8-b852-e4164a3158b2" (UID: "12aa0ded-8535-41a8-b852-e4164a3158b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.334679 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12aa0ded-8535-41a8-b852-e4164a3158b2-kube-api-access-7g6mh" (OuterVolumeSpecName: "kube-api-access-7g6mh") pod "12aa0ded-8535-41a8-b852-e4164a3158b2" (UID: "12aa0ded-8535-41a8-b852-e4164a3158b2"). InnerVolumeSpecName "kube-api-access-7g6mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.427057 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.427102 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g6mh\" (UniqueName: \"kubernetes.io/projected/12aa0ded-8535-41a8-b852-e4164a3158b2-kube-api-access-7g6mh\") on node \"crc\" DevicePath \"\"" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.483834 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12aa0ded-8535-41a8-b852-e4164a3158b2" (UID: "12aa0ded-8535-41a8-b852-e4164a3158b2"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.528523 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12aa0ded-8535-41a8-b852-e4164a3158b2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.729686 4892 generic.go:334] "Generic (PLEG): container finished" podID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerID="3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da" exitCode=0 Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.729736 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerDied","Data":"3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da"} Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.729768 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twxcd" event={"ID":"12aa0ded-8535-41a8-b852-e4164a3158b2","Type":"ContainerDied","Data":"c6f250cf417cb312eb2f578873ed55638ca6d7ca9758e5232ff1dc333deed6b8"} Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.729788 4892 scope.go:117] "RemoveContainer" containerID="3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.729955 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-twxcd" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.750529 4892 scope.go:117] "RemoveContainer" containerID="533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.768364 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-twxcd"] Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.777148 4892 scope.go:117] "RemoveContainer" containerID="5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.780118 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-twxcd"] Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.812849 4892 scope.go:117] "RemoveContainer" containerID="3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da" Feb 17 18:19:34 crc kubenswrapper[4892]: E0217 18:19:34.813601 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da\": container with ID starting with 3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da not found: ID does not exist" containerID="3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.813654 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da"} err="failed to get container status \"3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da\": rpc error: code = NotFound desc = could not find container \"3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da\": container with ID starting with 3ac66770ee5a3d75ead7f6730446596b2184c4ff5e21d0f9742eea15104158da not found: ID does 
not exist" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.813685 4892 scope.go:117] "RemoveContainer" containerID="533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37" Feb 17 18:19:34 crc kubenswrapper[4892]: E0217 18:19:34.814215 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37\": container with ID starting with 533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37 not found: ID does not exist" containerID="533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.814253 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37"} err="failed to get container status \"533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37\": rpc error: code = NotFound desc = could not find container \"533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37\": container with ID starting with 533c1d4bd33149a7708107d6f7a2b67c773324f5870f3c986192ca911eed9d37 not found: ID does not exist" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.814279 4892 scope.go:117] "RemoveContainer" containerID="5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d" Feb 17 18:19:34 crc kubenswrapper[4892]: E0217 18:19:34.814547 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d\": container with ID starting with 5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d not found: ID does not exist" containerID="5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d" Feb 17 18:19:34 crc kubenswrapper[4892]: I0217 18:19:34.814575 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d"} err="failed to get container status \"5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d\": rpc error: code = NotFound desc = could not find container \"5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d\": container with ID starting with 5f0727857544bf38bcb92284e5846954df2192beb8dad43e248a6c56098b7e2d not found: ID does not exist" Feb 17 18:19:35 crc kubenswrapper[4892]: I0217 18:19:35.375754 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" path="/var/lib/kubelet/pods/12aa0ded-8535-41a8-b852-e4164a3158b2/volumes" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.324862 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rpm8b"] Feb 17 18:19:50 crc kubenswrapper[4892]: E0217 18:19:50.326694 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="registry-server" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.326774 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="registry-server" Feb 17 18:19:50 crc kubenswrapper[4892]: E0217 18:19:50.326902 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="extract-content" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.326976 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="extract-content" Feb 17 18:19:50 crc kubenswrapper[4892]: E0217 18:19:50.327056 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="extract-utilities" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.327114 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="extract-utilities" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.327344 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="12aa0ded-8535-41a8-b852-e4164a3158b2" containerName="registry-server" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.328583 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.358560 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpm8b"] Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.388380 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-catalog-content\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.388695 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqns\" (UniqueName: \"kubernetes.io/projected/5eea8ef9-e0c9-4530-830e-59ad0e502832-kube-api-access-wrqns\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.388798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-utilities\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 
18:19:50.491050 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-catalog-content\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.491137 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqns\" (UniqueName: \"kubernetes.io/projected/5eea8ef9-e0c9-4530-830e-59ad0e502832-kube-api-access-wrqns\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.491180 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-utilities\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.491775 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-utilities\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.491940 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-catalog-content\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.517730 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqns\" (UniqueName: \"kubernetes.io/projected/5eea8ef9-e0c9-4530-830e-59ad0e502832-kube-api-access-wrqns\") pod \"community-operators-rpm8b\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:50 crc kubenswrapper[4892]: I0217 18:19:50.652801 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:19:51 crc kubenswrapper[4892]: I0217 18:19:51.234123 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpm8b"] Feb 17 18:19:52 crc kubenswrapper[4892]: I0217 18:19:52.027552 4892 generic.go:334] "Generic (PLEG): container finished" podID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerID="0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9" exitCode=0 Feb 17 18:19:52 crc kubenswrapper[4892]: I0217 18:19:52.027652 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpm8b" event={"ID":"5eea8ef9-e0c9-4530-830e-59ad0e502832","Type":"ContainerDied","Data":"0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9"} Feb 17 18:19:52 crc kubenswrapper[4892]: I0217 18:19:52.027963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpm8b" event={"ID":"5eea8ef9-e0c9-4530-830e-59ad0e502832","Type":"ContainerStarted","Data":"6089bbe2d2a37971ca19720fff4304ee0f6cbf64049437bf07bd5462bb74a5ce"} Feb 17 18:19:54 crc kubenswrapper[4892]: I0217 18:19:54.046461 4892 generic.go:334] "Generic (PLEG): container finished" podID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerID="17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c" exitCode=0 Feb 17 18:19:54 crc kubenswrapper[4892]: I0217 18:19:54.046543 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rpm8b" event={"ID":"5eea8ef9-e0c9-4530-830e-59ad0e502832","Type":"ContainerDied","Data":"17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c"} Feb 17 18:19:55 crc kubenswrapper[4892]: I0217 18:19:55.057020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpm8b" event={"ID":"5eea8ef9-e0c9-4530-830e-59ad0e502832","Type":"ContainerStarted","Data":"1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2"} Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.891381 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rpm8b" podStartSLOduration=7.464481565 podStartE2EDuration="9.891360172s" podCreationTimestamp="2026-02-17 18:19:50 +0000 UTC" firstStartedPulling="2026-02-17 18:19:52.030003284 +0000 UTC m=+2163.405406549" lastFinishedPulling="2026-02-17 18:19:54.456881891 +0000 UTC m=+2165.832285156" observedRunningTime="2026-02-17 18:19:55.075258213 +0000 UTC m=+2166.450661488" watchObservedRunningTime="2026-02-17 18:19:59.891360172 +0000 UTC m=+2171.266763447" Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.893067 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gndk6"] Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.895922 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.903613 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gndk6"] Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.955408 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxb9\" (UniqueName: \"kubernetes.io/projected/04dcafb0-9799-4e3b-8a26-79e73f460b6e-kube-api-access-pgxb9\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.955495 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-catalog-content\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:19:59 crc kubenswrapper[4892]: I0217 18:19:59.955660 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-utilities\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.057107 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxb9\" (UniqueName: \"kubernetes.io/projected/04dcafb0-9799-4e3b-8a26-79e73f460b6e-kube-api-access-pgxb9\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.057177 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-catalog-content\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.057208 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-utilities\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.057629 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-utilities\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.057777 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-catalog-content\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.075788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxb9\" (UniqueName: \"kubernetes.io/projected/04dcafb0-9799-4e3b-8a26-79e73f460b6e-kube-api-access-pgxb9\") pod \"certified-operators-gndk6\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.216034 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.653726 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.654053 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:20:00 crc kubenswrapper[4892]: I0217 18:20:00.700731 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:20:01 crc kubenswrapper[4892]: I0217 18:20:01.044467 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gndk6"] Feb 17 18:20:01 crc kubenswrapper[4892]: W0217 18:20:01.048174 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04dcafb0_9799_4e3b_8a26_79e73f460b6e.slice/crio-5a1dcbf6f498b7a88c3ae4a416fb50fb789ed790635b75d7a491da7d1f81200d WatchSource:0}: Error finding container 5a1dcbf6f498b7a88c3ae4a416fb50fb789ed790635b75d7a491da7d1f81200d: Status 404 returned error can't find the container with id 5a1dcbf6f498b7a88c3ae4a416fb50fb789ed790635b75d7a491da7d1f81200d Feb 17 18:20:01 crc kubenswrapper[4892]: I0217 18:20:01.105018 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gndk6" event={"ID":"04dcafb0-9799-4e3b-8a26-79e73f460b6e","Type":"ContainerStarted","Data":"5a1dcbf6f498b7a88c3ae4a416fb50fb789ed790635b75d7a491da7d1f81200d"} Feb 17 18:20:01 crc kubenswrapper[4892]: I0217 18:20:01.160626 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:20:02 crc kubenswrapper[4892]: I0217 18:20:02.116974 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerID="577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6" exitCode=0 Feb 17 18:20:02 crc kubenswrapper[4892]: I0217 18:20:02.117029 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gndk6" event={"ID":"04dcafb0-9799-4e3b-8a26-79e73f460b6e","Type":"ContainerDied","Data":"577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6"} Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.076354 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rpm8b"] Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.128311 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rpm8b" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="registry-server" containerID="cri-o://1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2" gracePeriod=2 Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.568742 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.614002 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-catalog-content\") pod \"5eea8ef9-e0c9-4530-830e-59ad0e502832\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.614189 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-utilities\") pod \"5eea8ef9-e0c9-4530-830e-59ad0e502832\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.614208 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqns\" (UniqueName: \"kubernetes.io/projected/5eea8ef9-e0c9-4530-830e-59ad0e502832-kube-api-access-wrqns\") pod \"5eea8ef9-e0c9-4530-830e-59ad0e502832\" (UID: \"5eea8ef9-e0c9-4530-830e-59ad0e502832\") " Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.616594 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-utilities" (OuterVolumeSpecName: "utilities") pod "5eea8ef9-e0c9-4530-830e-59ad0e502832" (UID: "5eea8ef9-e0c9-4530-830e-59ad0e502832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.620254 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eea8ef9-e0c9-4530-830e-59ad0e502832-kube-api-access-wrqns" (OuterVolumeSpecName: "kube-api-access-wrqns") pod "5eea8ef9-e0c9-4530-830e-59ad0e502832" (UID: "5eea8ef9-e0c9-4530-830e-59ad0e502832"). InnerVolumeSpecName "kube-api-access-wrqns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.678427 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5eea8ef9-e0c9-4530-830e-59ad0e502832" (UID: "5eea8ef9-e0c9-4530-830e-59ad0e502832"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.714975 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.715151 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqns\" (UniqueName: \"kubernetes.io/projected/5eea8ef9-e0c9-4530-830e-59ad0e502832-kube-api-access-wrqns\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:03 crc kubenswrapper[4892]: I0217 18:20:03.715211 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eea8ef9-e0c9-4530-830e-59ad0e502832-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:03 crc kubenswrapper[4892]: E0217 18:20:03.811028 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04dcafb0_9799_4e3b_8a26_79e73f460b6e.slice/crio-conmon-67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04dcafb0_9799_4e3b_8a26_79e73f460b6e.slice/crio-67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4.scope\": RecentStats: unable to find data in memory cache]" Feb 17 18:20:04 crc 
kubenswrapper[4892]: I0217 18:20:04.138410 4892 generic.go:334] "Generic (PLEG): container finished" podID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerID="1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2" exitCode=0 Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.138492 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpm8b" event={"ID":"5eea8ef9-e0c9-4530-830e-59ad0e502832","Type":"ContainerDied","Data":"1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2"} Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.138507 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpm8b" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.138525 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpm8b" event={"ID":"5eea8ef9-e0c9-4530-830e-59ad0e502832","Type":"ContainerDied","Data":"6089bbe2d2a37971ca19720fff4304ee0f6cbf64049437bf07bd5462bb74a5ce"} Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.138548 4892 scope.go:117] "RemoveContainer" containerID="1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.141270 4892 generic.go:334] "Generic (PLEG): container finished" podID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerID="67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4" exitCode=0 Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.141303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gndk6" event={"ID":"04dcafb0-9799-4e3b-8a26-79e73f460b6e","Type":"ContainerDied","Data":"67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4"} Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.164931 4892 scope.go:117] "RemoveContainer" 
containerID="17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.187575 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rpm8b"] Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.195790 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rpm8b"] Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.205382 4892 scope.go:117] "RemoveContainer" containerID="0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.238559 4892 scope.go:117] "RemoveContainer" containerID="1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2" Feb 17 18:20:04 crc kubenswrapper[4892]: E0217 18:20:04.239622 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2\": container with ID starting with 1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2 not found: ID does not exist" containerID="1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.239662 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2"} err="failed to get container status \"1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2\": rpc error: code = NotFound desc = could not find container \"1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2\": container with ID starting with 1c44945cadcfae573d011463537cb56221680d1a8271469a0c683649d4b60de2 not found: ID does not exist" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.239685 4892 scope.go:117] "RemoveContainer" 
containerID="17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c" Feb 17 18:20:04 crc kubenswrapper[4892]: E0217 18:20:04.240159 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c\": container with ID starting with 17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c not found: ID does not exist" containerID="17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.240213 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c"} err="failed to get container status \"17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c\": rpc error: code = NotFound desc = could not find container \"17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c\": container with ID starting with 17db6f5f9546773be7f57d6cc3fd8c701a1b79b6bcca6a3df1c007d8654aca6c not found: ID does not exist" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.240264 4892 scope.go:117] "RemoveContainer" containerID="0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9" Feb 17 18:20:04 crc kubenswrapper[4892]: E0217 18:20:04.240675 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9\": container with ID starting with 0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9 not found: ID does not exist" containerID="0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9" Feb 17 18:20:04 crc kubenswrapper[4892]: I0217 18:20:04.240701 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9"} err="failed to get container status \"0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9\": rpc error: code = NotFound desc = could not find container \"0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9\": container with ID starting with 0e858378b7fad9483d0f67fb000d3b4642b9c78ca20757b5bcaea0ef9fded9f9 not found: ID does not exist" Feb 17 18:20:05 crc kubenswrapper[4892]: I0217 18:20:05.157060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gndk6" event={"ID":"04dcafb0-9799-4e3b-8a26-79e73f460b6e","Type":"ContainerStarted","Data":"39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d"} Feb 17 18:20:05 crc kubenswrapper[4892]: I0217 18:20:05.179063 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gndk6" podStartSLOduration=3.690197308 podStartE2EDuration="6.179033156s" podCreationTimestamp="2026-02-17 18:19:59 +0000 UTC" firstStartedPulling="2026-02-17 18:20:02.119712428 +0000 UTC m=+2173.495115693" lastFinishedPulling="2026-02-17 18:20:04.608548266 +0000 UTC m=+2175.983951541" observedRunningTime="2026-02-17 18:20:05.175368467 +0000 UTC m=+2176.550771732" watchObservedRunningTime="2026-02-17 18:20:05.179033156 +0000 UTC m=+2176.554436451" Feb 17 18:20:05 crc kubenswrapper[4892]: I0217 18:20:05.372623 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" path="/var/lib/kubelet/pods/5eea8ef9-e0c9-4530-830e-59ad0e502832/volumes" Feb 17 18:20:10 crc kubenswrapper[4892]: I0217 18:20:10.217302 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:10 crc kubenswrapper[4892]: I0217 18:20:10.217347 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:10 crc kubenswrapper[4892]: I0217 18:20:10.278827 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:11 crc kubenswrapper[4892]: I0217 18:20:11.269345 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:11 crc kubenswrapper[4892]: I0217 18:20:11.332662 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gndk6"] Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.244776 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gndk6" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="registry-server" containerID="cri-o://39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d" gracePeriod=2 Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.674721 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.796385 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgxb9\" (UniqueName: \"kubernetes.io/projected/04dcafb0-9799-4e3b-8a26-79e73f460b6e-kube-api-access-pgxb9\") pod \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.796460 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-catalog-content\") pod \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.796513 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-utilities\") pod \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\" (UID: \"04dcafb0-9799-4e3b-8a26-79e73f460b6e\") " Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.797620 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-utilities" (OuterVolumeSpecName: "utilities") pod "04dcafb0-9799-4e3b-8a26-79e73f460b6e" (UID: "04dcafb0-9799-4e3b-8a26-79e73f460b6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.802376 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04dcafb0-9799-4e3b-8a26-79e73f460b6e-kube-api-access-pgxb9" (OuterVolumeSpecName: "kube-api-access-pgxb9") pod "04dcafb0-9799-4e3b-8a26-79e73f460b6e" (UID: "04dcafb0-9799-4e3b-8a26-79e73f460b6e"). InnerVolumeSpecName "kube-api-access-pgxb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.856792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04dcafb0-9799-4e3b-8a26-79e73f460b6e" (UID: "04dcafb0-9799-4e3b-8a26-79e73f460b6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.898422 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.898477 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgxb9\" (UniqueName: \"kubernetes.io/projected/04dcafb0-9799-4e3b-8a26-79e73f460b6e-kube-api-access-pgxb9\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:13 crc kubenswrapper[4892]: I0217 18:20:13.898496 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dcafb0-9799-4e3b-8a26-79e73f460b6e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.257581 4892 generic.go:334] "Generic (PLEG): container finished" podID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerID="39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d" exitCode=0 Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.257639 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gndk6" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.257680 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gndk6" event={"ID":"04dcafb0-9799-4e3b-8a26-79e73f460b6e","Type":"ContainerDied","Data":"39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d"} Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.258040 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gndk6" event={"ID":"04dcafb0-9799-4e3b-8a26-79e73f460b6e","Type":"ContainerDied","Data":"5a1dcbf6f498b7a88c3ae4a416fb50fb789ed790635b75d7a491da7d1f81200d"} Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.258067 4892 scope.go:117] "RemoveContainer" containerID="39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.279458 4892 scope.go:117] "RemoveContainer" containerID="67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.294460 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gndk6"] Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.300680 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gndk6"] Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.317671 4892 scope.go:117] "RemoveContainer" containerID="577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.341108 4892 scope.go:117] "RemoveContainer" containerID="39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d" Feb 17 18:20:14 crc kubenswrapper[4892]: E0217 18:20:14.341513 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d\": container with ID starting with 39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d not found: ID does not exist" containerID="39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.341565 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d"} err="failed to get container status \"39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d\": rpc error: code = NotFound desc = could not find container \"39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d\": container with ID starting with 39aa6eef4b2331ea9865c9843dcc9564426ba236fdba89c6f6d02caaa597c42d not found: ID does not exist" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.341599 4892 scope.go:117] "RemoveContainer" containerID="67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4" Feb 17 18:20:14 crc kubenswrapper[4892]: E0217 18:20:14.341890 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4\": container with ID starting with 67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4 not found: ID does not exist" containerID="67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.341920 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4"} err="failed to get container status \"67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4\": rpc error: code = NotFound desc = could not find container \"67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4\": container with ID 
starting with 67a24a75855a8f3dd965dde6d1a52be00a1fc1be73fd3ac9c7911829285f2cd4 not found: ID does not exist" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.341941 4892 scope.go:117] "RemoveContainer" containerID="577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6" Feb 17 18:20:14 crc kubenswrapper[4892]: E0217 18:20:14.342145 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6\": container with ID starting with 577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6 not found: ID does not exist" containerID="577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6" Feb 17 18:20:14 crc kubenswrapper[4892]: I0217 18:20:14.342170 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6"} err="failed to get container status \"577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6\": rpc error: code = NotFound desc = could not find container \"577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6\": container with ID starting with 577e2fcefc67d4da141c1610ad1779530b3763f58b7f1ae914c1737a08d410b6 not found: ID does not exist" Feb 17 18:20:15 crc kubenswrapper[4892]: I0217 18:20:15.371556 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" path="/var/lib/kubelet/pods/04dcafb0-9799-4e3b-8a26-79e73f460b6e/volumes" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.329886 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-28z84"] Feb 17 18:20:23 crc kubenswrapper[4892]: E0217 18:20:23.331047 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="extract-utilities" Feb 17 18:20:23 crc 
kubenswrapper[4892]: I0217 18:20:23.331070 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="extract-utilities" Feb 17 18:20:23 crc kubenswrapper[4892]: E0217 18:20:23.331104 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="extract-content" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.331116 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="extract-content" Feb 17 18:20:23 crc kubenswrapper[4892]: E0217 18:20:23.331141 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="registry-server" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.331152 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="registry-server" Feb 17 18:20:23 crc kubenswrapper[4892]: E0217 18:20:23.331184 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="extract-utilities" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.331196 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="extract-utilities" Feb 17 18:20:23 crc kubenswrapper[4892]: E0217 18:20:23.331235 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="registry-server" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.331248 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="registry-server" Feb 17 18:20:23 crc kubenswrapper[4892]: E0217 18:20:23.331269 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="extract-content" Feb 17 18:20:23 crc 
kubenswrapper[4892]: I0217 18:20:23.331280 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="extract-content" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.331596 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="04dcafb0-9799-4e3b-8a26-79e73f460b6e" containerName="registry-server" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.331635 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eea8ef9-e0c9-4530-830e-59ad0e502832" containerName="registry-server" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.333685 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.371387 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28z84"] Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.384652 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-catalog-content\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.384697 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfvw\" (UniqueName: \"kubernetes.io/projected/a07c1da6-2451-4b75-947d-fc9e2dec4169-kube-api-access-7bfvw\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.384912 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-utilities\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.487027 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-utilities\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.487139 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-catalog-content\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.487174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bfvw\" (UniqueName: \"kubernetes.io/projected/a07c1da6-2451-4b75-947d-fc9e2dec4169-kube-api-access-7bfvw\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.487585 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-utilities\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.487770 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-catalog-content\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.521789 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bfvw\" (UniqueName: \"kubernetes.io/projected/a07c1da6-2451-4b75-947d-fc9e2dec4169-kube-api-access-7bfvw\") pod \"redhat-marketplace-28z84\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:23 crc kubenswrapper[4892]: I0217 18:20:23.653698 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:24 crc kubenswrapper[4892]: I0217 18:20:24.112677 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28z84"] Feb 17 18:20:24 crc kubenswrapper[4892]: I0217 18:20:24.371257 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerStarted","Data":"e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9"} Feb 17 18:20:24 crc kubenswrapper[4892]: I0217 18:20:24.371720 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerStarted","Data":"967143f087a65a18e4e1485c878776e39e3e92ca24c3d47f6ec1383aed59db5e"} Feb 17 18:20:25 crc kubenswrapper[4892]: I0217 18:20:25.384646 4892 generic.go:334] "Generic (PLEG): container finished" podID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerID="e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9" exitCode=0 Feb 17 18:20:25 crc kubenswrapper[4892]: I0217 18:20:25.385003 4892 generic.go:334] "Generic (PLEG): container 
finished" podID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerID="dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda" exitCode=0 Feb 17 18:20:25 crc kubenswrapper[4892]: I0217 18:20:25.384686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerDied","Data":"e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9"} Feb 17 18:20:25 crc kubenswrapper[4892]: I0217 18:20:25.385045 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerDied","Data":"dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda"} Feb 17 18:20:26 crc kubenswrapper[4892]: I0217 18:20:26.395309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerStarted","Data":"21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7"} Feb 17 18:20:26 crc kubenswrapper[4892]: I0217 18:20:26.416056 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28z84" podStartSLOduration=1.910317612 podStartE2EDuration="3.416027806s" podCreationTimestamp="2026-02-17 18:20:23 +0000 UTC" firstStartedPulling="2026-02-17 18:20:24.37500212 +0000 UTC m=+2195.750405385" lastFinishedPulling="2026-02-17 18:20:25.880712314 +0000 UTC m=+2197.256115579" observedRunningTime="2026-02-17 18:20:26.411269637 +0000 UTC m=+2197.786672902" watchObservedRunningTime="2026-02-17 18:20:26.416027806 +0000 UTC m=+2197.791431091" Feb 17 18:20:33 crc kubenswrapper[4892]: I0217 18:20:33.654402 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:33 crc kubenswrapper[4892]: I0217 18:20:33.656084 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:33 crc kubenswrapper[4892]: I0217 18:20:33.703243 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:34 crc kubenswrapper[4892]: I0217 18:20:34.510873 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:34 crc kubenswrapper[4892]: I0217 18:20:34.569261 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28z84"] Feb 17 18:20:36 crc kubenswrapper[4892]: I0217 18:20:36.489501 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-28z84" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="registry-server" containerID="cri-o://21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7" gracePeriod=2 Feb 17 18:20:36 crc kubenswrapper[4892]: I0217 18:20:36.935164 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.018324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bfvw\" (UniqueName: \"kubernetes.io/projected/a07c1da6-2451-4b75-947d-fc9e2dec4169-kube-api-access-7bfvw\") pod \"a07c1da6-2451-4b75-947d-fc9e2dec4169\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.018390 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-utilities\") pod \"a07c1da6-2451-4b75-947d-fc9e2dec4169\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.018588 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-catalog-content\") pod \"a07c1da6-2451-4b75-947d-fc9e2dec4169\" (UID: \"a07c1da6-2451-4b75-947d-fc9e2dec4169\") " Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.019309 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-utilities" (OuterVolumeSpecName: "utilities") pod "a07c1da6-2451-4b75-947d-fc9e2dec4169" (UID: "a07c1da6-2451-4b75-947d-fc9e2dec4169"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.023322 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07c1da6-2451-4b75-947d-fc9e2dec4169-kube-api-access-7bfvw" (OuterVolumeSpecName: "kube-api-access-7bfvw") pod "a07c1da6-2451-4b75-947d-fc9e2dec4169" (UID: "a07c1da6-2451-4b75-947d-fc9e2dec4169"). InnerVolumeSpecName "kube-api-access-7bfvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.045302 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a07c1da6-2451-4b75-947d-fc9e2dec4169" (UID: "a07c1da6-2451-4b75-947d-fc9e2dec4169"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.120206 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.120238 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bfvw\" (UniqueName: \"kubernetes.io/projected/a07c1da6-2451-4b75-947d-fc9e2dec4169-kube-api-access-7bfvw\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.120249 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a07c1da6-2451-4b75-947d-fc9e2dec4169-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.425177 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.425236 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.500394 4892 generic.go:334] "Generic (PLEG): container finished" podID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerID="21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7" exitCode=0 Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.500452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerDied","Data":"21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7"} Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.500485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28z84" event={"ID":"a07c1da6-2451-4b75-947d-fc9e2dec4169","Type":"ContainerDied","Data":"967143f087a65a18e4e1485c878776e39e3e92ca24c3d47f6ec1383aed59db5e"} Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.500494 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28z84" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.500507 4892 scope.go:117] "RemoveContainer" containerID="21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.528588 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28z84"] Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.537314 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-28z84"] Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.539082 4892 scope.go:117] "RemoveContainer" containerID="dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.558577 4892 scope.go:117] "RemoveContainer" containerID="e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.587290 4892 scope.go:117] "RemoveContainer" containerID="21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7" Feb 17 18:20:37 crc kubenswrapper[4892]: E0217 18:20:37.587727 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7\": container with ID starting with 21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7 not found: ID does not exist" containerID="21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.587781 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7"} err="failed to get container status \"21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7\": rpc error: code = NotFound desc = could not find container 
\"21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7\": container with ID starting with 21e69d58cc8d89c77e35e46b70676b6f05c14669b5ba8479444dc01281560ef7 not found: ID does not exist" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.587829 4892 scope.go:117] "RemoveContainer" containerID="dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda" Feb 17 18:20:37 crc kubenswrapper[4892]: E0217 18:20:37.588350 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda\": container with ID starting with dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda not found: ID does not exist" containerID="dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.588379 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda"} err="failed to get container status \"dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda\": rpc error: code = NotFound desc = could not find container \"dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda\": container with ID starting with dbaec37e4b07847f2140fafe79e5529d2a925e1b8aa02ebf1fbb8c8d96f70dda not found: ID does not exist" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.588395 4892 scope.go:117] "RemoveContainer" containerID="e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9" Feb 17 18:20:37 crc kubenswrapper[4892]: E0217 18:20:37.588645 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9\": container with ID starting with e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9 not found: ID does not exist" 
containerID="e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9" Feb 17 18:20:37 crc kubenswrapper[4892]: I0217 18:20:37.588676 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9"} err="failed to get container status \"e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9\": rpc error: code = NotFound desc = could not find container \"e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9\": container with ID starting with e191638db054494ea084646128676555fefcfc092143788a8c364cfccd2d77b9 not found: ID does not exist" Feb 17 18:20:39 crc kubenswrapper[4892]: I0217 18:20:39.395176 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" path="/var/lib/kubelet/pods/a07c1da6-2451-4b75-947d-fc9e2dec4169/volumes" Feb 17 18:21:07 crc kubenswrapper[4892]: I0217 18:21:07.425220 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:21:07 crc kubenswrapper[4892]: I0217 18:21:07.425731 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:21:37 crc kubenswrapper[4892]: I0217 18:21:37.425323 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 17 18:21:37 crc kubenswrapper[4892]: I0217 18:21:37.425987 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:21:37 crc kubenswrapper[4892]: I0217 18:21:37.426038 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:21:37 crc kubenswrapper[4892]: I0217 18:21:37.426790 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:21:37 crc kubenswrapper[4892]: I0217 18:21:37.426921 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" gracePeriod=600 Feb 17 18:21:37 crc kubenswrapper[4892]: E0217 18:21:37.550250 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:21:38 crc kubenswrapper[4892]: I0217 
18:21:38.101652 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" exitCode=0 Feb 17 18:21:38 crc kubenswrapper[4892]: I0217 18:21:38.101699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736"} Feb 17 18:21:38 crc kubenswrapper[4892]: I0217 18:21:38.101738 4892 scope.go:117] "RemoveContainer" containerID="17bcf2f3d5f53df9b0508cf062145c481335de2a6af1329fbd270946f1b3c622" Feb 17 18:21:38 crc kubenswrapper[4892]: I0217 18:21:38.102318 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:21:38 crc kubenswrapper[4892]: E0217 18:21:38.102680 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:21:53 crc kubenswrapper[4892]: I0217 18:21:53.360167 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:21:53 crc kubenswrapper[4892]: E0217 18:21:53.360944 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:22:05 crc kubenswrapper[4892]: I0217 18:22:05.360246 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:22:05 crc kubenswrapper[4892]: E0217 18:22:05.361177 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:22:16 crc kubenswrapper[4892]: I0217 18:22:16.359464 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:22:16 crc kubenswrapper[4892]: E0217 18:22:16.360349 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:22:31 crc kubenswrapper[4892]: I0217 18:22:31.360199 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:22:31 crc kubenswrapper[4892]: E0217 18:22:31.361317 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:22:45 crc kubenswrapper[4892]: I0217 18:22:45.361643 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:22:45 crc kubenswrapper[4892]: E0217 18:22:45.362583 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:22:57 crc kubenswrapper[4892]: I0217 18:22:57.360647 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:22:57 crc kubenswrapper[4892]: E0217 18:22:57.362441 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:23:09 crc kubenswrapper[4892]: I0217 18:23:09.369944 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:23:09 crc kubenswrapper[4892]: E0217 18:23:09.370739 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:23:20 crc kubenswrapper[4892]: I0217 18:23:20.359776 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:23:20 crc kubenswrapper[4892]: E0217 18:23:20.360689 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:23:34 crc kubenswrapper[4892]: I0217 18:23:34.360853 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:23:34 crc kubenswrapper[4892]: E0217 18:23:34.362097 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:23:46 crc kubenswrapper[4892]: I0217 18:23:46.359990 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:23:46 crc kubenswrapper[4892]: E0217 18:23:46.360971 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:24:00 crc kubenswrapper[4892]: I0217 18:24:00.360686 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:24:00 crc kubenswrapper[4892]: E0217 18:24:00.361666 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:24:14 crc kubenswrapper[4892]: I0217 18:24:14.359893 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:24:14 crc kubenswrapper[4892]: E0217 18:24:14.360650 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:24:29 crc kubenswrapper[4892]: I0217 18:24:29.369179 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:24:29 crc kubenswrapper[4892]: E0217 18:24:29.370087 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:24:40 crc kubenswrapper[4892]: I0217 18:24:40.359987 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:24:40 crc kubenswrapper[4892]: E0217 18:24:40.361681 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:24:55 crc kubenswrapper[4892]: I0217 18:24:55.359637 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:24:55 crc kubenswrapper[4892]: E0217 18:24:55.360344 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:25:10 crc kubenswrapper[4892]: I0217 18:25:10.361241 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:25:10 crc kubenswrapper[4892]: E0217 18:25:10.362562 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:25:25 crc kubenswrapper[4892]: I0217 18:25:25.360790 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:25:25 crc kubenswrapper[4892]: E0217 18:25:25.361913 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:25:39 crc kubenswrapper[4892]: I0217 18:25:39.364631 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:25:39 crc kubenswrapper[4892]: E0217 18:25:39.365404 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:25:52 crc kubenswrapper[4892]: I0217 18:25:52.359727 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:25:52 crc kubenswrapper[4892]: E0217 18:25:52.360719 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:26:04 crc kubenswrapper[4892]: I0217 18:26:04.360112 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:26:04 crc kubenswrapper[4892]: E0217 18:26:04.361320 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:26:17 crc kubenswrapper[4892]: I0217 18:26:17.360485 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:26:17 crc kubenswrapper[4892]: E0217 18:26:17.361437 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:26:32 crc kubenswrapper[4892]: I0217 18:26:32.359618 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:26:32 crc kubenswrapper[4892]: E0217 18:26:32.360383 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:26:45 crc kubenswrapper[4892]: I0217 18:26:45.360327 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:26:46 crc kubenswrapper[4892]: I0217 18:26:46.271734 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"7fc7a9cdbbcd0e8aa68ed9058147ae85e5e43f5116c72459ea9a473802a43953"} Feb 17 18:29:07 crc kubenswrapper[4892]: I0217 18:29:07.424880 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:29:07 crc kubenswrapper[4892]: I0217 18:29:07.425386 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:29:37 crc kubenswrapper[4892]: I0217 18:29:37.425436 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:29:37 crc 
kubenswrapper[4892]: I0217 18:29:37.427054 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.159956 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t"] Feb 17 18:30:00 crc kubenswrapper[4892]: E0217 18:30:00.160941 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="extract-content" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.160967 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="extract-content" Feb 17 18:30:00 crc kubenswrapper[4892]: E0217 18:30:00.160999 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="registry-server" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.161012 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="registry-server" Feb 17 18:30:00 crc kubenswrapper[4892]: E0217 18:30:00.161041 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="extract-utilities" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.161054 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="extract-utilities" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.161327 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07c1da6-2451-4b75-947d-fc9e2dec4169" containerName="registry-server" Feb 17 18:30:00 crc 
kubenswrapper[4892]: I0217 18:30:00.162095 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.166492 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.173847 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t"] Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.175595 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.247598 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95eb9988-23c6-4a17-a5da-6b6c70984deb-config-volume\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.247751 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4lw\" (UniqueName: \"kubernetes.io/projected/95eb9988-23c6-4a17-a5da-6b6c70984deb-kube-api-access-4b4lw\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.248050 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95eb9988-23c6-4a17-a5da-6b6c70984deb-secret-volume\") pod \"collect-profiles-29522550-txp2t\" 
(UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.349805 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95eb9988-23c6-4a17-a5da-6b6c70984deb-config-volume\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.349899 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4lw\" (UniqueName: \"kubernetes.io/projected/95eb9988-23c6-4a17-a5da-6b6c70984deb-kube-api-access-4b4lw\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.350152 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95eb9988-23c6-4a17-a5da-6b6c70984deb-secret-volume\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.350899 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95eb9988-23c6-4a17-a5da-6b6c70984deb-config-volume\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.365451 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/95eb9988-23c6-4a17-a5da-6b6c70984deb-secret-volume\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.381920 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4lw\" (UniqueName: \"kubernetes.io/projected/95eb9988-23c6-4a17-a5da-6b6c70984deb-kube-api-access-4b4lw\") pod \"collect-profiles-29522550-txp2t\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.487169 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:00 crc kubenswrapper[4892]: I0217 18:30:00.948300 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t"] Feb 17 18:30:01 crc kubenswrapper[4892]: I0217 18:30:01.215878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" event={"ID":"95eb9988-23c6-4a17-a5da-6b6c70984deb","Type":"ContainerStarted","Data":"7ea9f3bf5ad27feff41b4c27eefa5a700f3e740e1f8e346d9e20edd2214bab6d"} Feb 17 18:30:01 crc kubenswrapper[4892]: I0217 18:30:01.215922 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" event={"ID":"95eb9988-23c6-4a17-a5da-6b6c70984deb","Type":"ContainerStarted","Data":"039be691a1dbd3ff309f4619bc3a1f9f293ce9e679e3fe1dcb233beb8b71cde9"} Feb 17 18:30:01 crc kubenswrapper[4892]: I0217 18:30:01.235419 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" 
podStartSLOduration=1.235400451 podStartE2EDuration="1.235400451s" podCreationTimestamp="2026-02-17 18:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 18:30:01.234402244 +0000 UTC m=+2772.609805519" watchObservedRunningTime="2026-02-17 18:30:01.235400451 +0000 UTC m=+2772.610803716" Feb 17 18:30:02 crc kubenswrapper[4892]: I0217 18:30:02.227619 4892 generic.go:334] "Generic (PLEG): container finished" podID="95eb9988-23c6-4a17-a5da-6b6c70984deb" containerID="7ea9f3bf5ad27feff41b4c27eefa5a700f3e740e1f8e346d9e20edd2214bab6d" exitCode=0 Feb 17 18:30:02 crc kubenswrapper[4892]: I0217 18:30:02.227679 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" event={"ID":"95eb9988-23c6-4a17-a5da-6b6c70984deb","Type":"ContainerDied","Data":"7ea9f3bf5ad27feff41b4c27eefa5a700f3e740e1f8e346d9e20edd2214bab6d"} Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.606551 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.712863 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95eb9988-23c6-4a17-a5da-6b6c70984deb-config-volume\") pod \"95eb9988-23c6-4a17-a5da-6b6c70984deb\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.713067 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b4lw\" (UniqueName: \"kubernetes.io/projected/95eb9988-23c6-4a17-a5da-6b6c70984deb-kube-api-access-4b4lw\") pod \"95eb9988-23c6-4a17-a5da-6b6c70984deb\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.713102 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95eb9988-23c6-4a17-a5da-6b6c70984deb-secret-volume\") pod \"95eb9988-23c6-4a17-a5da-6b6c70984deb\" (UID: \"95eb9988-23c6-4a17-a5da-6b6c70984deb\") " Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.718473 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95eb9988-23c6-4a17-a5da-6b6c70984deb-config-volume" (OuterVolumeSpecName: "config-volume") pod "95eb9988-23c6-4a17-a5da-6b6c70984deb" (UID: "95eb9988-23c6-4a17-a5da-6b6c70984deb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.723130 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95eb9988-23c6-4a17-a5da-6b6c70984deb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "95eb9988-23c6-4a17-a5da-6b6c70984deb" (UID: "95eb9988-23c6-4a17-a5da-6b6c70984deb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.723709 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95eb9988-23c6-4a17-a5da-6b6c70984deb-kube-api-access-4b4lw" (OuterVolumeSpecName: "kube-api-access-4b4lw") pod "95eb9988-23c6-4a17-a5da-6b6c70984deb" (UID: "95eb9988-23c6-4a17-a5da-6b6c70984deb"). InnerVolumeSpecName "kube-api-access-4b4lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.815585 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95eb9988-23c6-4a17-a5da-6b6c70984deb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.815633 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b4lw\" (UniqueName: \"kubernetes.io/projected/95eb9988-23c6-4a17-a5da-6b6c70984deb-kube-api-access-4b4lw\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:03 crc kubenswrapper[4892]: I0217 18:30:03.815654 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95eb9988-23c6-4a17-a5da-6b6c70984deb-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:04 crc kubenswrapper[4892]: I0217 18:30:04.247627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" event={"ID":"95eb9988-23c6-4a17-a5da-6b6c70984deb","Type":"ContainerDied","Data":"039be691a1dbd3ff309f4619bc3a1f9f293ce9e679e3fe1dcb233beb8b71cde9"} Feb 17 18:30:04 crc kubenswrapper[4892]: I0217 18:30:04.247679 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="039be691a1dbd3ff309f4619bc3a1f9f293ce9e679e3fe1dcb233beb8b71cde9" Feb 17 18:30:04 crc kubenswrapper[4892]: I0217 18:30:04.247714 4892 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t" Feb 17 18:30:04 crc kubenswrapper[4892]: I0217 18:30:04.318167 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6"] Feb 17 18:30:04 crc kubenswrapper[4892]: I0217 18:30:04.326844 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-6hhs6"] Feb 17 18:30:05 crc kubenswrapper[4892]: I0217 18:30:05.381356 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1111f441-f7d1-4115-b288-48cef127137a" path="/var/lib/kubelet/pods/1111f441-f7d1-4115-b288-48cef127137a/volumes" Feb 17 18:30:07 crc kubenswrapper[4892]: I0217 18:30:07.424386 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:30:07 crc kubenswrapper[4892]: I0217 18:30:07.424459 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:30:07 crc kubenswrapper[4892]: I0217 18:30:07.424511 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:30:07 crc kubenswrapper[4892]: I0217 18:30:07.425220 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fc7a9cdbbcd0e8aa68ed9058147ae85e5e43f5116c72459ea9a473802a43953"} 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:30:07 crc kubenswrapper[4892]: I0217 18:30:07.425299 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://7fc7a9cdbbcd0e8aa68ed9058147ae85e5e43f5116c72459ea9a473802a43953" gracePeriod=600 Feb 17 18:30:08 crc kubenswrapper[4892]: I0217 18:30:08.283219 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="7fc7a9cdbbcd0e8aa68ed9058147ae85e5e43f5116c72459ea9a473802a43953" exitCode=0 Feb 17 18:30:08 crc kubenswrapper[4892]: I0217 18:30:08.283301 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"7fc7a9cdbbcd0e8aa68ed9058147ae85e5e43f5116c72459ea9a473802a43953"} Feb 17 18:30:08 crc kubenswrapper[4892]: I0217 18:30:08.283842 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"} Feb 17 18:30:08 crc kubenswrapper[4892]: I0217 18:30:08.283883 4892 scope.go:117] "RemoveContainer" containerID="e0e0ad3b976f07609332977599bfbd7aff086139ea33e9c693f03eb70bc21736" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.072767 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q6p9l"] Feb 17 18:30:14 crc kubenswrapper[4892]: E0217 18:30:14.073885 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="95eb9988-23c6-4a17-a5da-6b6c70984deb" containerName="collect-profiles" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.073900 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eb9988-23c6-4a17-a5da-6b6c70984deb" containerName="collect-profiles" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.074082 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="95eb9988-23c6-4a17-a5da-6b6c70984deb" containerName="collect-profiles" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.075351 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.089855 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q6p9l"] Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.183843 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-utilities\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.183893 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8k97\" (UniqueName: \"kubernetes.io/projected/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-kube-api-access-l8k97\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.183954 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-catalog-content\") pod \"community-operators-q6p9l\" (UID: 
\"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.285488 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-catalog-content\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.285620 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-utilities\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.285646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8k97\" (UniqueName: \"kubernetes.io/projected/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-kube-api-access-l8k97\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.286352 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-catalog-content\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.286588 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-utilities\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") 
" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.315635 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8k97\" (UniqueName: \"kubernetes.io/projected/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-kube-api-access-l8k97\") pod \"community-operators-q6p9l\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.398462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:14 crc kubenswrapper[4892]: W0217 18:30:14.899917 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ed2048_28f5_47b3_89dd_37d1b4ff797b.slice/crio-f6df1d5be6d9b9cb79b419721cb36895665d82a77f8d7b9a178e9633f6f12ee1 WatchSource:0}: Error finding container f6df1d5be6d9b9cb79b419721cb36895665d82a77f8d7b9a178e9633f6f12ee1: Status 404 returned error can't find the container with id f6df1d5be6d9b9cb79b419721cb36895665d82a77f8d7b9a178e9633f6f12ee1 Feb 17 18:30:14 crc kubenswrapper[4892]: I0217 18:30:14.900412 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q6p9l"] Feb 17 18:30:15 crc kubenswrapper[4892]: I0217 18:30:15.361367 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerID="ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046" exitCode=0 Feb 17 18:30:15 crc kubenswrapper[4892]: I0217 18:30:15.363524 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:30:15 crc kubenswrapper[4892]: I0217 18:30:15.370980 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" 
event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerDied","Data":"ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046"} Feb 17 18:30:15 crc kubenswrapper[4892]: I0217 18:30:15.371011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerStarted","Data":"f6df1d5be6d9b9cb79b419721cb36895665d82a77f8d7b9a178e9633f6f12ee1"} Feb 17 18:30:16 crc kubenswrapper[4892]: I0217 18:30:16.370313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerStarted","Data":"f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc"} Feb 17 18:30:17 crc kubenswrapper[4892]: I0217 18:30:17.383106 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerID="f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc" exitCode=0 Feb 17 18:30:17 crc kubenswrapper[4892]: I0217 18:30:17.383333 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerDied","Data":"f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc"} Feb 17 18:30:18 crc kubenswrapper[4892]: I0217 18:30:18.406944 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerStarted","Data":"efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a"} Feb 17 18:30:18 crc kubenswrapper[4892]: I0217 18:30:18.430546 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q6p9l" podStartSLOduration=1.940571706 podStartE2EDuration="4.430526478s" podCreationTimestamp="2026-02-17 18:30:14 
+0000 UTC" firstStartedPulling="2026-02-17 18:30:15.363177247 +0000 UTC m=+2786.738580522" lastFinishedPulling="2026-02-17 18:30:17.853132029 +0000 UTC m=+2789.228535294" observedRunningTime="2026-02-17 18:30:18.430179569 +0000 UTC m=+2789.805582844" watchObservedRunningTime="2026-02-17 18:30:18.430526478 +0000 UTC m=+2789.805929763" Feb 17 18:30:19 crc kubenswrapper[4892]: I0217 18:30:19.853130 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pqx58"] Feb 17 18:30:19 crc kubenswrapper[4892]: I0217 18:30:19.856336 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:19 crc kubenswrapper[4892]: I0217 18:30:19.863329 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqx58"] Feb 17 18:30:19 crc kubenswrapper[4892]: I0217 18:30:19.985467 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-catalog-content\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:19 crc kubenswrapper[4892]: I0217 18:30:19.985588 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7kz\" (UniqueName: \"kubernetes.io/projected/177be440-8218-46c9-a38f-a4b8123eb0d5-kube-api-access-2l7kz\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:19 crc kubenswrapper[4892]: I0217 18:30:19.985641 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-utilities\") pod \"redhat-operators-pqx58\" (UID: 
\"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.087451 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7kz\" (UniqueName: \"kubernetes.io/projected/177be440-8218-46c9-a38f-a4b8123eb0d5-kube-api-access-2l7kz\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.087523 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-utilities\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.087556 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-catalog-content\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.088061 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-catalog-content\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.088275 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-utilities\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " 
pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.111368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7kz\" (UniqueName: \"kubernetes.io/projected/177be440-8218-46c9-a38f-a4b8123eb0d5-kube-api-access-2l7kz\") pod \"redhat-operators-pqx58\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.172788 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:20 crc kubenswrapper[4892]: W0217 18:30:20.654287 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod177be440_8218_46c9_a38f_a4b8123eb0d5.slice/crio-d3d22bdefedae337d90290436589ae9afc078041a0b8bac23a7225d49a98b87e WatchSource:0}: Error finding container d3d22bdefedae337d90290436589ae9afc078041a0b8bac23a7225d49a98b87e: Status 404 returned error can't find the container with id d3d22bdefedae337d90290436589ae9afc078041a0b8bac23a7225d49a98b87e Feb 17 18:30:20 crc kubenswrapper[4892]: I0217 18:30:20.658503 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqx58"] Feb 17 18:30:21 crc kubenswrapper[4892]: I0217 18:30:21.432126 4892 generic.go:334] "Generic (PLEG): container finished" podID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerID="98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76" exitCode=0 Feb 17 18:30:21 crc kubenswrapper[4892]: I0217 18:30:21.432450 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerDied","Data":"98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76"} Feb 17 18:30:21 crc kubenswrapper[4892]: I0217 18:30:21.432555 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerStarted","Data":"d3d22bdefedae337d90290436589ae9afc078041a0b8bac23a7225d49a98b87e"} Feb 17 18:30:22 crc kubenswrapper[4892]: I0217 18:30:22.450625 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerStarted","Data":"a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc"} Feb 17 18:30:23 crc kubenswrapper[4892]: I0217 18:30:23.459942 4892 generic.go:334] "Generic (PLEG): container finished" podID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerID="a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc" exitCode=0 Feb 17 18:30:23 crc kubenswrapper[4892]: I0217 18:30:23.459980 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerDied","Data":"a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc"} Feb 17 18:30:24 crc kubenswrapper[4892]: I0217 18:30:24.398860 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:24 crc kubenswrapper[4892]: I0217 18:30:24.399308 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:24 crc kubenswrapper[4892]: I0217 18:30:24.481242 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:24 crc kubenswrapper[4892]: I0217 18:30:24.544583 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:26 crc kubenswrapper[4892]: I0217 18:30:26.644414 4892 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q6p9l"] Feb 17 18:30:26 crc kubenswrapper[4892]: I0217 18:30:26.645039 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q6p9l" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerName="registry-server" containerID="cri-o://efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a" gracePeriod=2 Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.089701 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.108428 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8k97\" (UniqueName: \"kubernetes.io/projected/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-kube-api-access-l8k97\") pod \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.108511 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-catalog-content\") pod \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.108545 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-utilities\") pod \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\" (UID: \"f0ed2048-28f5-47b3-89dd-37d1b4ff797b\") " Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.109679 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-utilities" (OuterVolumeSpecName: "utilities") pod 
"f0ed2048-28f5-47b3-89dd-37d1b4ff797b" (UID: "f0ed2048-28f5-47b3-89dd-37d1b4ff797b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.118097 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-kube-api-access-l8k97" (OuterVolumeSpecName: "kube-api-access-l8k97") pod "f0ed2048-28f5-47b3-89dd-37d1b4ff797b" (UID: "f0ed2048-28f5-47b3-89dd-37d1b4ff797b"). InnerVolumeSpecName "kube-api-access-l8k97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.181047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0ed2048-28f5-47b3-89dd-37d1b4ff797b" (UID: "f0ed2048-28f5-47b3-89dd-37d1b4ff797b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.209895 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.209930 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8k97\" (UniqueName: \"kubernetes.io/projected/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-kube-api-access-l8k97\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.209942 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ed2048-28f5-47b3-89dd-37d1b4ff797b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.517647 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerID="efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a" exitCode=0 Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.517699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerDied","Data":"efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a"} Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.517726 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6p9l" event={"ID":"f0ed2048-28f5-47b3-89dd-37d1b4ff797b","Type":"ContainerDied","Data":"f6df1d5be6d9b9cb79b419721cb36895665d82a77f8d7b9a178e9633f6f12ee1"} Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.517735 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q6p9l" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.517747 4892 scope.go:117] "RemoveContainer" containerID="efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.543413 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q6p9l"] Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.550680 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q6p9l"] Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.558220 4892 scope.go:117] "RemoveContainer" containerID="f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.582761 4892 scope.go:117] "RemoveContainer" containerID="ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.608549 4892 scope.go:117] "RemoveContainer" containerID="efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a" Feb 17 18:30:27 crc kubenswrapper[4892]: E0217 18:30:27.609077 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a\": container with ID starting with efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a not found: ID does not exist" containerID="efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.609117 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a"} err="failed to get container status \"efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a\": rpc error: code = NotFound desc = could not find 
container \"efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a\": container with ID starting with efaac5c7eb8183dadfa2c0125c9ef7b055fdcfe69d07eccbf30a7670a4b8a30a not found: ID does not exist" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.609143 4892 scope.go:117] "RemoveContainer" containerID="f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc" Feb 17 18:30:27 crc kubenswrapper[4892]: E0217 18:30:27.609505 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc\": container with ID starting with f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc not found: ID does not exist" containerID="f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.609549 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc"} err="failed to get container status \"f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc\": rpc error: code = NotFound desc = could not find container \"f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc\": container with ID starting with f73c4c128726aefda3c99e12b89e07ae48353fb58041a0c0cefdebdaf3bab8dc not found: ID does not exist" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.609577 4892 scope.go:117] "RemoveContainer" containerID="ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046" Feb 17 18:30:27 crc kubenswrapper[4892]: E0217 18:30:27.609938 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046\": container with ID starting with ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046 not found: ID does 
not exist" containerID="ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046" Feb 17 18:30:27 crc kubenswrapper[4892]: I0217 18:30:27.609982 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046"} err="failed to get container status \"ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046\": rpc error: code = NotFound desc = could not find container \"ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046\": container with ID starting with ee770a334483df4b498712702cb0d63ec8415fa0ddfe7f31aa0079d03bf32046 not found: ID does not exist" Feb 17 18:30:28 crc kubenswrapper[4892]: I0217 18:30:28.530178 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerStarted","Data":"be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135"} Feb 17 18:30:28 crc kubenswrapper[4892]: I0217 18:30:28.558208 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pqx58" podStartSLOduration=2.894779806 podStartE2EDuration="9.558190755s" podCreationTimestamp="2026-02-17 18:30:19 +0000 UTC" firstStartedPulling="2026-02-17 18:30:21.433887885 +0000 UTC m=+2792.809291150" lastFinishedPulling="2026-02-17 18:30:28.097298834 +0000 UTC m=+2799.472702099" observedRunningTime="2026-02-17 18:30:28.553843758 +0000 UTC m=+2799.929247103" watchObservedRunningTime="2026-02-17 18:30:28.558190755 +0000 UTC m=+2799.933594020" Feb 17 18:30:29 crc kubenswrapper[4892]: I0217 18:30:29.371996 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" path="/var/lib/kubelet/pods/f0ed2048-28f5-47b3-89dd-37d1b4ff797b/volumes" Feb 17 18:30:30 crc kubenswrapper[4892]: I0217 18:30:30.173424 4892 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:30 crc kubenswrapper[4892]: I0217 18:30:30.174988 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:31 crc kubenswrapper[4892]: I0217 18:30:31.224608 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pqx58" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="registry-server" probeResult="failure" output=< Feb 17 18:30:31 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 18:30:31 crc kubenswrapper[4892]: > Feb 17 18:30:40 crc kubenswrapper[4892]: I0217 18:30:40.256758 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:40 crc kubenswrapper[4892]: I0217 18:30:40.343285 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:40 crc kubenswrapper[4892]: I0217 18:30:40.508696 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqx58"] Feb 17 18:30:41 crc kubenswrapper[4892]: I0217 18:30:41.678281 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pqx58" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="registry-server" containerID="cri-o://be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135" gracePeriod=2 Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.127858 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.274444 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l7kz\" (UniqueName: \"kubernetes.io/projected/177be440-8218-46c9-a38f-a4b8123eb0d5-kube-api-access-2l7kz\") pod \"177be440-8218-46c9-a38f-a4b8123eb0d5\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.274505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-utilities\") pod \"177be440-8218-46c9-a38f-a4b8123eb0d5\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.274576 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-catalog-content\") pod \"177be440-8218-46c9-a38f-a4b8123eb0d5\" (UID: \"177be440-8218-46c9-a38f-a4b8123eb0d5\") " Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.275328 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-utilities" (OuterVolumeSpecName: "utilities") pod "177be440-8218-46c9-a38f-a4b8123eb0d5" (UID: "177be440-8218-46c9-a38f-a4b8123eb0d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.280708 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177be440-8218-46c9-a38f-a4b8123eb0d5-kube-api-access-2l7kz" (OuterVolumeSpecName: "kube-api-access-2l7kz") pod "177be440-8218-46c9-a38f-a4b8123eb0d5" (UID: "177be440-8218-46c9-a38f-a4b8123eb0d5"). InnerVolumeSpecName "kube-api-access-2l7kz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.377998 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l7kz\" (UniqueName: \"kubernetes.io/projected/177be440-8218-46c9-a38f-a4b8123eb0d5-kube-api-access-2l7kz\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.378343 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.420925 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "177be440-8218-46c9-a38f-a4b8123eb0d5" (UID: "177be440-8218-46c9-a38f-a4b8123eb0d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.479846 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177be440-8218-46c9-a38f-a4b8123eb0d5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.694029 4892 generic.go:334] "Generic (PLEG): container finished" podID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerID="be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135" exitCode=0 Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.694118 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerDied","Data":"be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135"} Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.694147 4892 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqx58" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.694173 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqx58" event={"ID":"177be440-8218-46c9-a38f-a4b8123eb0d5","Type":"ContainerDied","Data":"d3d22bdefedae337d90290436589ae9afc078041a0b8bac23a7225d49a98b87e"} Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.694205 4892 scope.go:117] "RemoveContainer" containerID="be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.737596 4892 scope.go:117] "RemoveContainer" containerID="a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.742244 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqx58"] Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.754809 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pqx58"] Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.781285 4892 scope.go:117] "RemoveContainer" containerID="98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.821177 4892 scope.go:117] "RemoveContainer" containerID="be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135" Feb 17 18:30:42 crc kubenswrapper[4892]: E0217 18:30:42.821937 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135\": container with ID starting with be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135 not found: ID does not exist" containerID="be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.821996 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135"} err="failed to get container status \"be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135\": rpc error: code = NotFound desc = could not find container \"be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135\": container with ID starting with be2329f8e18335965fa124cfb4031c9eef1d06e066f34ed3a1988702ecc32135 not found: ID does not exist" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.822030 4892 scope.go:117] "RemoveContainer" containerID="a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc" Feb 17 18:30:42 crc kubenswrapper[4892]: E0217 18:30:42.822647 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc\": container with ID starting with a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc not found: ID does not exist" containerID="a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.822688 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc"} err="failed to get container status \"a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc\": rpc error: code = NotFound desc = could not find container \"a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc\": container with ID starting with a1ed851ec033d4dc90427d70578d7ba0a84bb2b09412c68b07944bc8bd0514cc not found: ID does not exist" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.822718 4892 scope.go:117] "RemoveContainer" containerID="98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76" Feb 17 18:30:42 crc kubenswrapper[4892]: E0217 
18:30:42.823340 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76\": container with ID starting with 98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76 not found: ID does not exist" containerID="98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76" Feb 17 18:30:42 crc kubenswrapper[4892]: I0217 18:30:42.823408 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76"} err="failed to get container status \"98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76\": rpc error: code = NotFound desc = could not find container \"98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76\": container with ID starting with 98e0557e39aea95d9fba4c09137d47425e31fcb53ce094ffa0c97840d673ee76 not found: ID does not exist" Feb 17 18:30:43 crc kubenswrapper[4892]: I0217 18:30:43.377988 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" path="/var/lib/kubelet/pods/177be440-8218-46c9-a38f-a4b8123eb0d5/volumes" Feb 17 18:30:59 crc kubenswrapper[4892]: I0217 18:30:59.350782 4892 scope.go:117] "RemoveContainer" containerID="39338101665f015d00683b1772c24a547ba1bb7237f21bb2a360429436d44f3f" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.395359 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xps4j"] Feb 17 18:31:20 crc kubenswrapper[4892]: E0217 18:31:20.396657 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerName="extract-content" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.396679 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" 
containerName="extract-content" Feb 17 18:31:20 crc kubenswrapper[4892]: E0217 18:31:20.396716 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerName="extract-utilities" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.396727 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerName="extract-utilities" Feb 17 18:31:20 crc kubenswrapper[4892]: E0217 18:31:20.396743 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="extract-utilities" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.396754 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="extract-utilities" Feb 17 18:31:20 crc kubenswrapper[4892]: E0217 18:31:20.396775 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="extract-content" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.396785 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="extract-content" Feb 17 18:31:20 crc kubenswrapper[4892]: E0217 18:31:20.396810 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="registry-server" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.396844 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="registry-server" Feb 17 18:31:20 crc kubenswrapper[4892]: E0217 18:31:20.396861 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerName="registry-server" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.396871 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" 
containerName="registry-server" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.397135 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="177be440-8218-46c9-a38f-a4b8123eb0d5" containerName="registry-server" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.397167 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ed2048-28f5-47b3-89dd-37d1b4ff797b" containerName="registry-server" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.399423 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.402360 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-utilities\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.402445 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-catalog-content\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.402524 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbgz\" (UniqueName: \"kubernetes.io/projected/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-kube-api-access-ntbgz\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.416437 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-xps4j"] Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.503791 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-utilities\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.503882 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-catalog-content\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.503917 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbgz\" (UniqueName: \"kubernetes.io/projected/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-kube-api-access-ntbgz\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.504224 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-utilities\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.504475 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-catalog-content\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " 
pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.532976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbgz\" (UniqueName: \"kubernetes.io/projected/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-kube-api-access-ntbgz\") pod \"certified-operators-xps4j\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") " pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:20 crc kubenswrapper[4892]: I0217 18:31:20.735916 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:21 crc kubenswrapper[4892]: I0217 18:31:21.233541 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xps4j"] Feb 17 18:31:21 crc kubenswrapper[4892]: W0217 18:31:21.244183 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d3fcc8_ce5a_4dab_8dff_7d146f383b8e.slice/crio-96d6f887efbeb78c742bd17ac2d692208aa65ea815741d7722195ef34f6948dd WatchSource:0}: Error finding container 96d6f887efbeb78c742bd17ac2d692208aa65ea815741d7722195ef34f6948dd: Status 404 returned error can't find the container with id 96d6f887efbeb78c742bd17ac2d692208aa65ea815741d7722195ef34f6948dd Feb 17 18:31:22 crc kubenswrapper[4892]: I0217 18:31:22.080722 4892 generic.go:334] "Generic (PLEG): container finished" podID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerID="aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541" exitCode=0 Feb 17 18:31:22 crc kubenswrapper[4892]: I0217 18:31:22.080780 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xps4j" event={"ID":"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e","Type":"ContainerDied","Data":"aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541"} Feb 17 18:31:22 crc kubenswrapper[4892]: I0217 18:31:22.081012 
4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xps4j" event={"ID":"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e","Type":"ContainerStarted","Data":"96d6f887efbeb78c742bd17ac2d692208aa65ea815741d7722195ef34f6948dd"} Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.089451 4892 generic.go:334] "Generic (PLEG): container finished" podID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerID="473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612" exitCode=0 Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.089545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xps4j" event={"ID":"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e","Type":"ContainerDied","Data":"473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612"} Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.172654 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ml88n"] Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.174568 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.209983 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml88n"] Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.244843 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-utilities\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.244979 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2k4p\" (UniqueName: \"kubernetes.io/projected/c968f129-812f-4611-9495-d44067465be1-kube-api-access-r2k4p\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.245020 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-catalog-content\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.346478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-utilities\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.346575 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-r2k4p\" (UniqueName: \"kubernetes.io/projected/c968f129-812f-4611-9495-d44067465be1-kube-api-access-r2k4p\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.346599 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-catalog-content\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.347157 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-utilities\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.347196 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-catalog-content\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.373847 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2k4p\" (UniqueName: \"kubernetes.io/projected/c968f129-812f-4611-9495-d44067465be1-kube-api-access-r2k4p\") pod \"redhat-marketplace-ml88n\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") " pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:23 crc kubenswrapper[4892]: I0217 18:31:23.511363 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:24 crc kubenswrapper[4892]: I0217 18:31:24.017027 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml88n"] Feb 17 18:31:24 crc kubenswrapper[4892]: I0217 18:31:24.100586 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xps4j" event={"ID":"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e","Type":"ContainerStarted","Data":"470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b"} Feb 17 18:31:24 crc kubenswrapper[4892]: I0217 18:31:24.102433 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerStarted","Data":"dd492865943db6c6d29381bd7f49b4347f88bea28972283c211289e77c67b779"} Feb 17 18:31:24 crc kubenswrapper[4892]: I0217 18:31:24.127852 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xps4j" podStartSLOduration=2.491740681 podStartE2EDuration="4.127829059s" podCreationTimestamp="2026-02-17 18:31:20 +0000 UTC" firstStartedPulling="2026-02-17 18:31:22.082587663 +0000 UTC m=+2853.457990928" lastFinishedPulling="2026-02-17 18:31:23.718676041 +0000 UTC m=+2855.094079306" observedRunningTime="2026-02-17 18:31:24.122610508 +0000 UTC m=+2855.498013773" watchObservedRunningTime="2026-02-17 18:31:24.127829059 +0000 UTC m=+2855.503232324" Feb 17 18:31:25 crc kubenswrapper[4892]: I0217 18:31:25.112623 4892 generic.go:334] "Generic (PLEG): container finished" podID="c968f129-812f-4611-9495-d44067465be1" containerID="d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219" exitCode=0 Feb 17 18:31:25 crc kubenswrapper[4892]: I0217 18:31:25.112696 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" 
event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerDied","Data":"d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219"} Feb 17 18:31:26 crc kubenswrapper[4892]: I0217 18:31:26.122546 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerStarted","Data":"70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82"} Feb 17 18:31:27 crc kubenswrapper[4892]: I0217 18:31:27.134208 4892 generic.go:334] "Generic (PLEG): container finished" podID="c968f129-812f-4611-9495-d44067465be1" containerID="70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82" exitCode=0 Feb 17 18:31:27 crc kubenswrapper[4892]: I0217 18:31:27.134345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerDied","Data":"70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82"} Feb 17 18:31:30 crc kubenswrapper[4892]: I0217 18:31:30.175028 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerStarted","Data":"d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f"} Feb 17 18:31:30 crc kubenswrapper[4892]: I0217 18:31:30.203044 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ml88n" podStartSLOduration=3.327817423 podStartE2EDuration="7.203022577s" podCreationTimestamp="2026-02-17 18:31:23 +0000 UTC" firstStartedPulling="2026-02-17 18:31:25.114233632 +0000 UTC m=+2856.489636907" lastFinishedPulling="2026-02-17 18:31:28.989438746 +0000 UTC m=+2860.364842061" observedRunningTime="2026-02-17 18:31:30.19757171 +0000 UTC m=+2861.572974985" watchObservedRunningTime="2026-02-17 18:31:30.203022577 +0000 UTC 
m=+2861.578425852" Feb 17 18:31:30 crc kubenswrapper[4892]: I0217 18:31:30.736282 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:30 crc kubenswrapper[4892]: I0217 18:31:30.736655 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:30 crc kubenswrapper[4892]: I0217 18:31:30.807316 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:31 crc kubenswrapper[4892]: I0217 18:31:31.239568 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xps4j" Feb 17 18:31:33 crc kubenswrapper[4892]: I0217 18:31:33.512337 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:33 crc kubenswrapper[4892]: I0217 18:31:33.512702 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:33 crc kubenswrapper[4892]: I0217 18:31:33.578644 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:34 crc kubenswrapper[4892]: I0217 18:31:34.287841 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ml88n" Feb 17 18:31:35 crc kubenswrapper[4892]: I0217 18:31:35.554921 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xps4j"] Feb 17 18:31:35 crc kubenswrapper[4892]: I0217 18:31:35.555174 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xps4j" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="registry-server" 
containerID="cri-o://470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b" gracePeriod=2
Feb 17 18:31:35 crc kubenswrapper[4892]: I0217 18:31:35.756239 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml88n"]
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.051532 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xps4j"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.184545 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-utilities\") pod \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") "
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.184721 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-catalog-content\") pod \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") "
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.184742 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntbgz\" (UniqueName: \"kubernetes.io/projected/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-kube-api-access-ntbgz\") pod \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\" (UID: \"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e\") "
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.185431 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-utilities" (OuterVolumeSpecName: "utilities") pod "d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" (UID: "d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.192615 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-kube-api-access-ntbgz" (OuterVolumeSpecName: "kube-api-access-ntbgz") pod "d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" (UID: "d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e"). InnerVolumeSpecName "kube-api-access-ntbgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.235507 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" (UID: "d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.248404 4892 generic.go:334] "Generic (PLEG): container finished" podID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerID="470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b" exitCode=0
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.248458 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xps4j"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.248531 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xps4j" event={"ID":"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e","Type":"ContainerDied","Data":"470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b"}
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.248662 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xps4j" event={"ID":"d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e","Type":"ContainerDied","Data":"96d6f887efbeb78c742bd17ac2d692208aa65ea815741d7722195ef34f6948dd"}
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.248695 4892 scope.go:117] "RemoveContainer" containerID="470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.248860 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ml88n" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="registry-server" containerID="cri-o://d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f" gracePeriod=2
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.288579 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.288623 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntbgz\" (UniqueName: \"kubernetes.io/projected/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-kube-api-access-ntbgz\") on node \"crc\" DevicePath \"\""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.288639 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.296060 4892 scope.go:117] "RemoveContainer" containerID="473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.296865 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xps4j"]
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.302311 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xps4j"]
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.356288 4892 scope.go:117] "RemoveContainer" containerID="aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.397833 4892 scope.go:117] "RemoveContainer" containerID="470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b"
Feb 17 18:31:36 crc kubenswrapper[4892]: E0217 18:31:36.400779 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b\": container with ID starting with 470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b not found: ID does not exist" containerID="470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.400912 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b"} err="failed to get container status \"470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b\": rpc error: code = NotFound desc = could not find container \"470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b\": container with ID starting with 470efea5ec1513382dcb5056dbebfd95c655c3bfbf250a84b11da6c8233b274b not found: ID does not exist"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.400957 4892 scope.go:117] "RemoveContainer" containerID="473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612"
Feb 17 18:31:36 crc kubenswrapper[4892]: E0217 18:31:36.401341 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612\": container with ID starting with 473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612 not found: ID does not exist" containerID="473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.401361 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612"} err="failed to get container status \"473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612\": rpc error: code = NotFound desc = could not find container \"473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612\": container with ID starting with 473b4c151fb2678b019f8a5df1d55ed2a2c244b2a975882e503258a163cff612 not found: ID does not exist"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.401375 4892 scope.go:117] "RemoveContainer" containerID="aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541"
Feb 17 18:31:36 crc kubenswrapper[4892]: E0217 18:31:36.401967 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541\": container with ID starting with aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541 not found: ID does not exist" containerID="aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.401994 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541"} err="failed to get container status \"aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541\": rpc error: code = NotFound desc = could not find container \"aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541\": container with ID starting with aa5abe2ed080be1f47c46c1a091baac06412590dcc47b0eecbfddcf219d2e541 not found: ID does not exist"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.609077 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml88n"
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.796319 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-catalog-content\") pod \"c968f129-812f-4611-9495-d44067465be1\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") "
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.796389 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-utilities\") pod \"c968f129-812f-4611-9495-d44067465be1\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") "
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.796455 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2k4p\" (UniqueName: \"kubernetes.io/projected/c968f129-812f-4611-9495-d44067465be1-kube-api-access-r2k4p\") pod \"c968f129-812f-4611-9495-d44067465be1\" (UID: \"c968f129-812f-4611-9495-d44067465be1\") "
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.797410 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-utilities" (OuterVolumeSpecName: "utilities") pod "c968f129-812f-4611-9495-d44067465be1" (UID: "c968f129-812f-4611-9495-d44067465be1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.802890 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c968f129-812f-4611-9495-d44067465be1-kube-api-access-r2k4p" (OuterVolumeSpecName: "kube-api-access-r2k4p") pod "c968f129-812f-4611-9495-d44067465be1" (UID: "c968f129-812f-4611-9495-d44067465be1"). InnerVolumeSpecName "kube-api-access-r2k4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.827290 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c968f129-812f-4611-9495-d44067465be1" (UID: "c968f129-812f-4611-9495-d44067465be1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.899220 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.899274 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c968f129-812f-4611-9495-d44067465be1-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 18:31:36 crc kubenswrapper[4892]: I0217 18:31:36.899293 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2k4p\" (UniqueName: \"kubernetes.io/projected/c968f129-812f-4611-9495-d44067465be1-kube-api-access-r2k4p\") on node \"crc\" DevicePath \"\""
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.265705 4892 generic.go:334] "Generic (PLEG): container finished" podID="c968f129-812f-4611-9495-d44067465be1" containerID="d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f" exitCode=0
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.265771 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerDied","Data":"d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f"}
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.265836 4892 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml88n"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.265871 4892 scope.go:117] "RemoveContainer" containerID="d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.265851 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml88n" event={"ID":"c968f129-812f-4611-9495-d44067465be1","Type":"ContainerDied","Data":"dd492865943db6c6d29381bd7f49b4347f88bea28972283c211289e77c67b779"}
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.294019 4892 scope.go:117] "RemoveContainer" containerID="70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.328253 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml88n"]
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.339153 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml88n"]
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.343006 4892 scope.go:117] "RemoveContainer" containerID="d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.366809 4892 scope.go:117] "RemoveContainer" containerID="d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f"
Feb 17 18:31:37 crc kubenswrapper[4892]: E0217 18:31:37.367481 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f\": container with ID starting with d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f not found: ID does not exist" containerID="d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.367621 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f"} err="failed to get container status \"d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f\": rpc error: code = NotFound desc = could not find container \"d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f\": container with ID starting with d9b7035f04924b76514dcb9f008b3c4f2d79b57a5cfdf725ced6164463aa810f not found: ID does not exist"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.367723 4892 scope.go:117] "RemoveContainer" containerID="70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82"
Feb 17 18:31:37 crc kubenswrapper[4892]: E0217 18:31:37.368300 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82\": container with ID starting with 70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82 not found: ID does not exist" containerID="70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.368349 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82"} err="failed to get container status \"70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82\": rpc error: code = NotFound desc = could not find container \"70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82\": container with ID starting with 70c1ca7518c3ff831497adc00e718c36fde3dc5bf351c7034c12730c5f190f82 not found: ID does not exist"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.368387 4892 scope.go:117] "RemoveContainer" containerID="d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219"
Feb 17 18:31:37 crc kubenswrapper[4892]: E0217 18:31:37.368779 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219\": container with ID starting with d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219 not found: ID does not exist" containerID="d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.368923 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219"} err="failed to get container status \"d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219\": rpc error: code = NotFound desc = could not find container \"d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219\": container with ID starting with d863f7941713e40818ea1b5e92595cd060f159af715740f0a053b1165ce64219 not found: ID does not exist"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.374012 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c968f129-812f-4611-9495-d44067465be1" path="/var/lib/kubelet/pods/c968f129-812f-4611-9495-d44067465be1/volumes"
Feb 17 18:31:37 crc kubenswrapper[4892]: I0217 18:31:37.374963 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" path="/var/lib/kubelet/pods/d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e/volumes"
Feb 17 18:32:07 crc kubenswrapper[4892]: I0217 18:32:07.424888 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 18:32:07 crc kubenswrapper[4892]: I0217 18:32:07.425643 4892 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 18:32:37 crc kubenswrapper[4892]: I0217 18:32:37.424753 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 18:32:37 crc kubenswrapper[4892]: I0217 18:32:37.425271 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 18:33:07 crc kubenswrapper[4892]: I0217 18:33:07.424956 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 18:33:07 crc kubenswrapper[4892]: I0217 18:33:07.425532 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 18:33:07 crc kubenswrapper[4892]: I0217 18:33:07.425592 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt"
Feb 17 18:33:07 crc kubenswrapper[4892]: I0217 18:33:07.426513 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 18:33:07 crc kubenswrapper[4892]: I0217 18:33:07.426635 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" gracePeriod=600
Feb 17 18:33:07 crc kubenswrapper[4892]: E0217 18:33:07.548669 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:33:08 crc kubenswrapper[4892]: I0217 18:33:08.196887 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" exitCode=0
Feb 17 18:33:08 crc kubenswrapper[4892]: I0217 18:33:08.196937 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"}
Feb 17 18:33:08 crc kubenswrapper[4892]: I0217 18:33:08.196976 4892 scope.go:117] "RemoveContainer"
containerID="7fc7a9cdbbcd0e8aa68ed9058147ae85e5e43f5116c72459ea9a473802a43953"
Feb 17 18:33:08 crc kubenswrapper[4892]: I0217 18:33:08.197587 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:33:08 crc kubenswrapper[4892]: E0217 18:33:08.197983 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:33:23 crc kubenswrapper[4892]: I0217 18:33:23.360087 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:33:23 crc kubenswrapper[4892]: E0217 18:33:23.360978 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:33:34 crc kubenswrapper[4892]: I0217 18:33:34.360756 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:33:34 crc kubenswrapper[4892]: E0217 18:33:34.361861 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:33:47 crc kubenswrapper[4892]: I0217 18:33:47.360138 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:33:47 crc kubenswrapper[4892]: E0217 18:33:47.360911 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:33:59 crc kubenswrapper[4892]: I0217 18:33:59.985922 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:33:59 crc kubenswrapper[4892]: E0217 18:33:59.987021 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:34:12 crc kubenswrapper[4892]: I0217 18:34:12.360417 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:34:12 crc kubenswrapper[4892]: E0217 18:34:12.362087 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:34:23 crc kubenswrapper[4892]: I0217 18:34:23.360130 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:34:23 crc kubenswrapper[4892]: E0217 18:34:23.360751 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:34:34 crc kubenswrapper[4892]: I0217 18:34:34.359145 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:34:34 crc kubenswrapper[4892]: E0217 18:34:34.359784 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:34:45 crc kubenswrapper[4892]: I0217 18:34:45.360169 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:34:45 crc kubenswrapper[4892]: E0217 18:34:45.361051 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:34:59 crc kubenswrapper[4892]: I0217 18:34:59.370698 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:34:59 crc kubenswrapper[4892]: E0217 18:34:59.371639 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:35:10 crc kubenswrapper[4892]: I0217 18:35:10.359940 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:35:10 crc kubenswrapper[4892]: E0217 18:35:10.362274 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:35:21 crc kubenswrapper[4892]: I0217 18:35:21.359883 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:35:21 crc kubenswrapper[4892]: E0217 18:35:21.360462 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:35:32 crc kubenswrapper[4892]: I0217 18:35:32.359705 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:35:32 crc kubenswrapper[4892]: E0217 18:35:32.360861 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:35:46 crc kubenswrapper[4892]: I0217 18:35:46.360002 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:35:46 crc kubenswrapper[4892]: E0217 18:35:46.363158 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:36:01 crc kubenswrapper[4892]: I0217 18:36:01.359768 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:36:01 crc kubenswrapper[4892]: E0217 18:36:01.361430 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:36:13 crc kubenswrapper[4892]: I0217 18:36:13.360343 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:36:13 crc kubenswrapper[4892]: E0217 18:36:13.361095 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:36:27 crc kubenswrapper[4892]: I0217 18:36:27.359587 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:36:27 crc kubenswrapper[4892]: E0217 18:36:27.361050 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:36:41 crc kubenswrapper[4892]: I0217 18:36:41.360355 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:36:41 crc kubenswrapper[4892]: E0217 18:36:41.361347 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:36:52 crc kubenswrapper[4892]: I0217 18:36:52.359986 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:36:52 crc kubenswrapper[4892]: E0217 18:36:52.362355 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:37:07 crc kubenswrapper[4892]: I0217 18:37:07.359714 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:37:07 crc kubenswrapper[4892]: E0217 18:37:07.361319 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 18:37:19 crc kubenswrapper[4892]: I0217 18:37:19.364754 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757"
Feb 17 18:37:19 crc kubenswrapper[4892]: E0217 18:37:19.365457 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:37:33 crc kubenswrapper[4892]: I0217 18:37:33.359700 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" Feb 17 18:37:33 crc kubenswrapper[4892]: E0217 18:37:33.360972 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:37:44 crc kubenswrapper[4892]: I0217 18:37:44.360186 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" Feb 17 18:37:44 crc kubenswrapper[4892]: E0217 18:37:44.360949 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:37:57 crc kubenswrapper[4892]: I0217 18:37:57.360354 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" Feb 17 18:37:57 crc kubenswrapper[4892]: E0217 18:37:57.361573 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:38:08 crc kubenswrapper[4892]: I0217 18:38:08.359771 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" Feb 17 18:38:08 crc kubenswrapper[4892]: I0217 18:38:08.974120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"fe86fc28385d9eb4781c2be11d00be8ffe1907a854f7e29ff267ed39565d85ba"} Feb 17 18:40:37 crc kubenswrapper[4892]: I0217 18:40:37.425149 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:40:37 crc kubenswrapper[4892]: I0217 18:40:37.426094 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:41:07 crc kubenswrapper[4892]: I0217 18:41:07.425129 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:41:07 crc 
kubenswrapper[4892]: I0217 18:41:07.425926 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.291274 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m9ltm"] Feb 17 18:41:10 crc kubenswrapper[4892]: E0217 18:41:10.291987 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="extract-utilities" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292009 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="extract-utilities" Feb 17 18:41:10 crc kubenswrapper[4892]: E0217 18:41:10.292027 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="extract-utilities" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292036 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="extract-utilities" Feb 17 18:41:10 crc kubenswrapper[4892]: E0217 18:41:10.292063 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="extract-content" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292071 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="extract-content" Feb 17 18:41:10 crc kubenswrapper[4892]: E0217 18:41:10.292082 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="registry-server" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 
18:41:10.292089 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="registry-server" Feb 17 18:41:10 crc kubenswrapper[4892]: E0217 18:41:10.292102 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="extract-content" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292110 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="extract-content" Feb 17 18:41:10 crc kubenswrapper[4892]: E0217 18:41:10.292130 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="registry-server" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292139 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="registry-server" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292313 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d3fcc8-ce5a-4dab-8dff-7d146f383b8e" containerName="registry-server" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.292325 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c968f129-812f-4611-9495-d44067465be1" containerName="registry-server" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.293671 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.310402 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9ltm"] Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.428952 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-utilities\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.429002 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flt8n\" (UniqueName: \"kubernetes.io/projected/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-kube-api-access-flt8n\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.429104 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-catalog-content\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.530600 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-catalog-content\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.530740 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-utilities\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.530794 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flt8n\" (UniqueName: \"kubernetes.io/projected/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-kube-api-access-flt8n\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.531189 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-catalog-content\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.531269 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-utilities\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.565427 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flt8n\" (UniqueName: \"kubernetes.io/projected/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-kube-api-access-flt8n\") pod \"redhat-operators-m9ltm\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:10 crc kubenswrapper[4892]: I0217 18:41:10.615365 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:11 crc kubenswrapper[4892]: I0217 18:41:11.039192 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9ltm"] Feb 17 18:41:12 crc kubenswrapper[4892]: I0217 18:41:12.089439 4892 generic.go:334] "Generic (PLEG): container finished" podID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerID="8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573" exitCode=0 Feb 17 18:41:12 crc kubenswrapper[4892]: I0217 18:41:12.089549 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerDied","Data":"8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573"} Feb 17 18:41:12 crc kubenswrapper[4892]: I0217 18:41:12.090639 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerStarted","Data":"2a9d8425ef76b30b2e6678d4ee3aa58502ad028b1a6841e46fdcae29912e44b4"} Feb 17 18:41:12 crc kubenswrapper[4892]: I0217 18:41:12.093446 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:41:13 crc kubenswrapper[4892]: I0217 18:41:13.100265 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerStarted","Data":"6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830"} Feb 17 18:41:14 crc kubenswrapper[4892]: I0217 18:41:14.109783 4892 generic.go:334] "Generic (PLEG): container finished" podID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerID="6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830" exitCode=0 Feb 17 18:41:14 crc kubenswrapper[4892]: I0217 18:41:14.109842 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerDied","Data":"6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830"} Feb 17 18:41:16 crc kubenswrapper[4892]: I0217 18:41:16.127480 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerStarted","Data":"3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3"} Feb 17 18:41:16 crc kubenswrapper[4892]: I0217 18:41:16.143304 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m9ltm" podStartSLOduration=2.699678606 podStartE2EDuration="6.143281893s" podCreationTimestamp="2026-02-17 18:41:10 +0000 UTC" firstStartedPulling="2026-02-17 18:41:12.093192964 +0000 UTC m=+3443.468596229" lastFinishedPulling="2026-02-17 18:41:15.536796261 +0000 UTC m=+3446.912199516" observedRunningTime="2026-02-17 18:41:16.141007142 +0000 UTC m=+3447.516410407" watchObservedRunningTime="2026-02-17 18:41:16.143281893 +0000 UTC m=+3447.518685158" Feb 17 18:41:20 crc kubenswrapper[4892]: I0217 18:41:20.615677 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:20 crc kubenswrapper[4892]: I0217 18:41:20.616241 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:21 crc kubenswrapper[4892]: I0217 18:41:21.670399 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m9ltm" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="registry-server" probeResult="failure" output=< Feb 17 18:41:21 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 18:41:21 crc kubenswrapper[4892]: > Feb 17 18:41:30 crc kubenswrapper[4892]: I0217 
18:41:30.682201 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:30 crc kubenswrapper[4892]: I0217 18:41:30.731213 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:30 crc kubenswrapper[4892]: I0217 18:41:30.925300 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9ltm"] Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.318159 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m9ltm" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="registry-server" containerID="cri-o://3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3" gracePeriod=2 Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.753395 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.795420 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flt8n\" (UniqueName: \"kubernetes.io/projected/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-kube-api-access-flt8n\") pod \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.795523 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-utilities\") pod \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.795922 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-catalog-content\") pod \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\" (UID: \"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa\") " Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.798025 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-utilities" (OuterVolumeSpecName: "utilities") pod "ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" (UID: "ad5e2b2e-3c9a-453c-a530-b6829c6a5faa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.801224 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-kube-api-access-flt8n" (OuterVolumeSpecName: "kube-api-access-flt8n") pod "ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" (UID: "ad5e2b2e-3c9a-453c-a530-b6829c6a5faa"). InnerVolumeSpecName "kube-api-access-flt8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.898240 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flt8n\" (UniqueName: \"kubernetes.io/projected/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-kube-api-access-flt8n\") on node \"crc\" DevicePath \"\"" Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.898573 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.936171 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" (UID: "ad5e2b2e-3c9a-453c-a530-b6829c6a5faa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:41:32 crc kubenswrapper[4892]: I0217 18:41:32.999729 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.335403 4892 generic.go:334] "Generic (PLEG): container finished" podID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerID="3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3" exitCode=0 Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.335482 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerDied","Data":"3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3"} Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.335545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9ltm" event={"ID":"ad5e2b2e-3c9a-453c-a530-b6829c6a5faa","Type":"ContainerDied","Data":"2a9d8425ef76b30b2e6678d4ee3aa58502ad028b1a6841e46fdcae29912e44b4"} Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.335586 4892 scope.go:117] "RemoveContainer" containerID="3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.335598 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9ltm" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.365045 4892 scope.go:117] "RemoveContainer" containerID="6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.391851 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9ltm"] Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.401220 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m9ltm"] Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.415488 4892 scope.go:117] "RemoveContainer" containerID="8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.444200 4892 scope.go:117] "RemoveContainer" containerID="3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3" Feb 17 18:41:33 crc kubenswrapper[4892]: E0217 18:41:33.444760 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3\": container with ID starting with 3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3 not found: ID does not exist" containerID="3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.444794 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3"} err="failed to get container status \"3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3\": rpc error: code = NotFound desc = could not find container \"3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3\": container with ID starting with 3f381493a08f9af6a7ce9c5f5a55d1061edd806ebc4b5f459d1a4ab5fe93aba3 not found: ID does 
not exist" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.444831 4892 scope.go:117] "RemoveContainer" containerID="6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830" Feb 17 18:41:33 crc kubenswrapper[4892]: E0217 18:41:33.445276 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830\": container with ID starting with 6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830 not found: ID does not exist" containerID="6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.445297 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830"} err="failed to get container status \"6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830\": rpc error: code = NotFound desc = could not find container \"6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830\": container with ID starting with 6d7a79df1db4d465d4c2b6cd46f4d10d0a00f84a1e3b9d5339207a2adbd3d830 not found: ID does not exist" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.445312 4892 scope.go:117] "RemoveContainer" containerID="8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573" Feb 17 18:41:33 crc kubenswrapper[4892]: E0217 18:41:33.445597 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573\": container with ID starting with 8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573 not found: ID does not exist" containerID="8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573" Feb 17 18:41:33 crc kubenswrapper[4892]: I0217 18:41:33.445618 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573"} err="failed to get container status \"8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573\": rpc error: code = NotFound desc = could not find container \"8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573\": container with ID starting with 8e2163f6de81eba425641093b468e11041fa5670055302c99e42e11f51d92573 not found: ID does not exist" Feb 17 18:41:35 crc kubenswrapper[4892]: I0217 18:41:35.377746 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" path="/var/lib/kubelet/pods/ad5e2b2e-3c9a-453c-a530-b6829c6a5faa/volumes" Feb 17 18:41:37 crc kubenswrapper[4892]: I0217 18:41:37.425763 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:41:37 crc kubenswrapper[4892]: I0217 18:41:37.425925 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:41:37 crc kubenswrapper[4892]: I0217 18:41:37.426551 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:41:37 crc kubenswrapper[4892]: I0217 18:41:37.427612 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe86fc28385d9eb4781c2be11d00be8ffe1907a854f7e29ff267ed39565d85ba"} 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:41:37 crc kubenswrapper[4892]: I0217 18:41:37.427757 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://fe86fc28385d9eb4781c2be11d00be8ffe1907a854f7e29ff267ed39565d85ba" gracePeriod=600 Feb 17 18:41:38 crc kubenswrapper[4892]: I0217 18:41:38.400496 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="fe86fc28385d9eb4781c2be11d00be8ffe1907a854f7e29ff267ed39565d85ba" exitCode=0 Feb 17 18:41:38 crc kubenswrapper[4892]: I0217 18:41:38.401056 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"fe86fc28385d9eb4781c2be11d00be8ffe1907a854f7e29ff267ed39565d85ba"} Feb 17 18:41:38 crc kubenswrapper[4892]: I0217 18:41:38.401184 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81"} Feb 17 18:41:38 crc kubenswrapper[4892]: I0217 18:41:38.401240 4892 scope.go:117] "RemoveContainer" containerID="931ae5ffd33ae2b47798ebc6b8ba622c3bafe46cb651ea27d2843230971d2757" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.724912 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bkhdg"] Feb 17 18:43:35 crc kubenswrapper[4892]: E0217 18:43:35.725912 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="extract-utilities" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.725928 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="extract-utilities" Feb 17 18:43:35 crc kubenswrapper[4892]: E0217 18:43:35.725948 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="registry-server" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.725957 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="registry-server" Feb 17 18:43:35 crc kubenswrapper[4892]: E0217 18:43:35.725984 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="extract-content" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.725992 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="extract-content" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.726206 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5e2b2e-3c9a-453c-a530-b6829c6a5faa" containerName="registry-server" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.727801 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.761978 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkhdg"] Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.893552 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-catalog-content\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.893647 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-utilities\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.893714 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6nrq\" (UniqueName: \"kubernetes.io/projected/38f571e6-3ddb-46f2-be46-1303d07ab52f-kube-api-access-z6nrq\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.995406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-catalog-content\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.995474 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-utilities\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.995519 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6nrq\" (UniqueName: \"kubernetes.io/projected/38f571e6-3ddb-46f2-be46-1303d07ab52f-kube-api-access-z6nrq\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.996210 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-catalog-content\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:35 crc kubenswrapper[4892]: I0217 18:43:35.996417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-utilities\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:36 crc kubenswrapper[4892]: I0217 18:43:36.017085 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6nrq\" (UniqueName: \"kubernetes.io/projected/38f571e6-3ddb-46f2-be46-1303d07ab52f-kube-api-access-z6nrq\") pod \"community-operators-bkhdg\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:36 crc kubenswrapper[4892]: I0217 18:43:36.057089 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:36 crc kubenswrapper[4892]: I0217 18:43:36.589397 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkhdg"] Feb 17 18:43:36 crc kubenswrapper[4892]: I0217 18:43:36.671321 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerStarted","Data":"4cee135a88a62ddfe39e05236fa37e7ee582297820004b3615b52fad31cfd95c"} Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.425375 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.425636 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.684457 4892 generic.go:334] "Generic (PLEG): container finished" podID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerID="907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2" exitCode=0 Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.684493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerDied","Data":"907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2"} Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.935153 4892 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-74tgn"] Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.939802 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:37 crc kubenswrapper[4892]: I0217 18:43:37.949212 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74tgn"] Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.145582 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nczg\" (UniqueName: \"kubernetes.io/projected/41a7480f-7795-4134-b7fd-debd314f8bba-kube-api-access-5nczg\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.145636 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-utilities\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.145689 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-catalog-content\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.247636 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nczg\" (UniqueName: \"kubernetes.io/projected/41a7480f-7795-4134-b7fd-debd314f8bba-kube-api-access-5nczg\") pod \"certified-operators-74tgn\" (UID: 
\"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.247705 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-utilities\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.247764 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-catalog-content\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.248364 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-catalog-content\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.250194 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-utilities\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.270786 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nczg\" (UniqueName: \"kubernetes.io/projected/41a7480f-7795-4134-b7fd-debd314f8bba-kube-api-access-5nczg\") pod \"certified-operators-74tgn\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " 
pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.279429 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.590725 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74tgn"] Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.695624 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerStarted","Data":"2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2"} Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.697075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerStarted","Data":"16d40205832380e912eb65fc1ce6e1ec11804554b0736cdc449ff3fb0b9c46df"} Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.918755 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gg6p8"] Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.920894 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.932367 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg6p8"] Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.960712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-utilities\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.960811 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-catalog-content\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:38 crc kubenswrapper[4892]: I0217 18:43:38.960924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lwk\" (UniqueName: \"kubernetes.io/projected/05a339fa-bca7-4f9c-be57-f4364d7b83ea-kube-api-access-b9lwk\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.062006 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-catalog-content\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.062067 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b9lwk\" (UniqueName: \"kubernetes.io/projected/05a339fa-bca7-4f9c-be57-f4364d7b83ea-kube-api-access-b9lwk\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.062119 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-utilities\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.062604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-catalog-content\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.062646 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-utilities\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.082456 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lwk\" (UniqueName: \"kubernetes.io/projected/05a339fa-bca7-4f9c-be57-f4364d7b83ea-kube-api-access-b9lwk\") pod \"redhat-marketplace-gg6p8\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.259946 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.708447 4892 generic.go:334] "Generic (PLEG): container finished" podID="41a7480f-7795-4134-b7fd-debd314f8bba" containerID="12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c" exitCode=0 Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.708545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerDied","Data":"12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c"} Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.711491 4892 generic.go:334] "Generic (PLEG): container finished" podID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerID="2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2" exitCode=0 Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.711518 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerDied","Data":"2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2"} Feb 17 18:43:39 crc kubenswrapper[4892]: W0217 18:43:39.752980 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a339fa_bca7_4f9c_be57_f4364d7b83ea.slice/crio-13ed48fbdbb3c5fcf53443f233e3d7a715410298ffe2ef60234550134654b20f WatchSource:0}: Error finding container 13ed48fbdbb3c5fcf53443f233e3d7a715410298ffe2ef60234550134654b20f: Status 404 returned error can't find the container with id 13ed48fbdbb3c5fcf53443f233e3d7a715410298ffe2ef60234550134654b20f Feb 17 18:43:39 crc kubenswrapper[4892]: I0217 18:43:39.765375 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg6p8"] Feb 17 18:43:40 crc kubenswrapper[4892]: I0217 
18:43:40.722465 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerStarted","Data":"1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803"} Feb 17 18:43:40 crc kubenswrapper[4892]: I0217 18:43:40.724226 4892 generic.go:334] "Generic (PLEG): container finished" podID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerID="b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0" exitCode=0 Feb 17 18:43:40 crc kubenswrapper[4892]: I0217 18:43:40.724282 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerDied","Data":"b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0"} Feb 17 18:43:40 crc kubenswrapper[4892]: I0217 18:43:40.724344 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerStarted","Data":"13ed48fbdbb3c5fcf53443f233e3d7a715410298ffe2ef60234550134654b20f"} Feb 17 18:43:40 crc kubenswrapper[4892]: I0217 18:43:40.730330 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerStarted","Data":"b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20"} Feb 17 18:43:40 crc kubenswrapper[4892]: I0217 18:43:40.750475 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bkhdg" podStartSLOduration=3.3594620649999998 podStartE2EDuration="5.750458952s" podCreationTimestamp="2026-02-17 18:43:35 +0000 UTC" firstStartedPulling="2026-02-17 18:43:37.688271708 +0000 UTC m=+3589.063674963" lastFinishedPulling="2026-02-17 18:43:40.079268585 +0000 UTC m=+3591.454671850" 
observedRunningTime="2026-02-17 18:43:40.742438115 +0000 UTC m=+3592.117841380" watchObservedRunningTime="2026-02-17 18:43:40.750458952 +0000 UTC m=+3592.125862217" Feb 17 18:43:41 crc kubenswrapper[4892]: I0217 18:43:41.741455 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerStarted","Data":"8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa"} Feb 17 18:43:41 crc kubenswrapper[4892]: I0217 18:43:41.744918 4892 generic.go:334] "Generic (PLEG): container finished" podID="41a7480f-7795-4134-b7fd-debd314f8bba" containerID="b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20" exitCode=0 Feb 17 18:43:41 crc kubenswrapper[4892]: I0217 18:43:41.744976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerDied","Data":"b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20"} Feb 17 18:43:41 crc kubenswrapper[4892]: I0217 18:43:41.745025 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerStarted","Data":"e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e"} Feb 17 18:43:42 crc kubenswrapper[4892]: I0217 18:43:42.755492 4892 generic.go:334] "Generic (PLEG): container finished" podID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerID="8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa" exitCode=0 Feb 17 18:43:42 crc kubenswrapper[4892]: I0217 18:43:42.755556 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerDied","Data":"8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa"} Feb 17 18:43:42 crc 
kubenswrapper[4892]: I0217 18:43:42.779641 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74tgn" podStartSLOduration=4.377491939 podStartE2EDuration="5.779621803s" podCreationTimestamp="2026-02-17 18:43:37 +0000 UTC" firstStartedPulling="2026-02-17 18:43:39.710364249 +0000 UTC m=+3591.085767524" lastFinishedPulling="2026-02-17 18:43:41.112494113 +0000 UTC m=+3592.487897388" observedRunningTime="2026-02-17 18:43:41.806613908 +0000 UTC m=+3593.182017193" watchObservedRunningTime="2026-02-17 18:43:42.779621803 +0000 UTC m=+3594.155025078" Feb 17 18:43:43 crc kubenswrapper[4892]: I0217 18:43:43.767718 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerStarted","Data":"f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2"} Feb 17 18:43:43 crc kubenswrapper[4892]: I0217 18:43:43.817634 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gg6p8" podStartSLOduration=3.426728135 podStartE2EDuration="5.817611819s" podCreationTimestamp="2026-02-17 18:43:38 +0000 UTC" firstStartedPulling="2026-02-17 18:43:40.72665817 +0000 UTC m=+3592.102061435" lastFinishedPulling="2026-02-17 18:43:43.117541834 +0000 UTC m=+3594.492945119" observedRunningTime="2026-02-17 18:43:43.80912267 +0000 UTC m=+3595.184525945" watchObservedRunningTime="2026-02-17 18:43:43.817611819 +0000 UTC m=+3595.193015084" Feb 17 18:43:46 crc kubenswrapper[4892]: I0217 18:43:46.057426 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:46 crc kubenswrapper[4892]: I0217 18:43:46.057756 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:46 crc kubenswrapper[4892]: I0217 
18:43:46.126578 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:46 crc kubenswrapper[4892]: I0217 18:43:46.883996 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:47 crc kubenswrapper[4892]: I0217 18:43:47.717490 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkhdg"] Feb 17 18:43:48 crc kubenswrapper[4892]: I0217 18:43:48.280041 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:48 crc kubenswrapper[4892]: I0217 18:43:48.280401 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:48 crc kubenswrapper[4892]: I0217 18:43:48.353912 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:48 crc kubenswrapper[4892]: I0217 18:43:48.833925 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bkhdg" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="registry-server" containerID="cri-o://1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803" gracePeriod=2 Feb 17 18:43:48 crc kubenswrapper[4892]: I0217 18:43:48.895871 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.260185 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.260879 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.296946 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.330620 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.462806 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-utilities\") pod \"38f571e6-3ddb-46f2-be46-1303d07ab52f\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.462940 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-catalog-content\") pod \"38f571e6-3ddb-46f2-be46-1303d07ab52f\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.463018 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6nrq\" (UniqueName: \"kubernetes.io/projected/38f571e6-3ddb-46f2-be46-1303d07ab52f-kube-api-access-z6nrq\") pod \"38f571e6-3ddb-46f2-be46-1303d07ab52f\" (UID: \"38f571e6-3ddb-46f2-be46-1303d07ab52f\") " Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.464578 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-utilities" (OuterVolumeSpecName: "utilities") pod "38f571e6-3ddb-46f2-be46-1303d07ab52f" (UID: "38f571e6-3ddb-46f2-be46-1303d07ab52f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.468476 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f571e6-3ddb-46f2-be46-1303d07ab52f-kube-api-access-z6nrq" (OuterVolumeSpecName: "kube-api-access-z6nrq") pod "38f571e6-3ddb-46f2-be46-1303d07ab52f" (UID: "38f571e6-3ddb-46f2-be46-1303d07ab52f"). InnerVolumeSpecName "kube-api-access-z6nrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.540377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38f571e6-3ddb-46f2-be46-1303d07ab52f" (UID: "38f571e6-3ddb-46f2-be46-1303d07ab52f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.565123 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.565160 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6nrq\" (UniqueName: \"kubernetes.io/projected/38f571e6-3ddb-46f2-be46-1303d07ab52f-kube-api-access-z6nrq\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.565171 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f571e6-3ddb-46f2-be46-1303d07ab52f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.868050 4892 generic.go:334] "Generic (PLEG): container finished" podID="38f571e6-3ddb-46f2-be46-1303d07ab52f" 
containerID="1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803" exitCode=0 Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.868094 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerDied","Data":"1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803"} Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.868138 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkhdg" event={"ID":"38f571e6-3ddb-46f2-be46-1303d07ab52f","Type":"ContainerDied","Data":"4cee135a88a62ddfe39e05236fa37e7ee582297820004b3615b52fad31cfd95c"} Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.868190 4892 scope.go:117] "RemoveContainer" containerID="1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.869645 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkhdg" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.888296 4892 scope.go:117] "RemoveContainer" containerID="2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.911632 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkhdg"] Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.916274 4892 scope.go:117] "RemoveContainer" containerID="907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.921277 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bkhdg"] Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.931445 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.947159 4892 scope.go:117] "RemoveContainer" containerID="1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803" Feb 17 18:43:49 crc kubenswrapper[4892]: E0217 18:43:49.947610 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803\": container with ID starting with 1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803 not found: ID does not exist" containerID="1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.947640 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803"} err="failed to get container status \"1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803\": rpc error: code = NotFound desc = could not 
find container \"1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803\": container with ID starting with 1f21bc4dc2334059b725ce63b4da124dee4e524c86d3cd420d63007f22370803 not found: ID does not exist" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.947659 4892 scope.go:117] "RemoveContainer" containerID="2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2" Feb 17 18:43:49 crc kubenswrapper[4892]: E0217 18:43:49.955155 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2\": container with ID starting with 2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2 not found: ID does not exist" containerID="2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.955247 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2"} err="failed to get container status \"2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2\": rpc error: code = NotFound desc = could not find container \"2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2\": container with ID starting with 2d3d0cb9b916ef56774a9424665b4eb587ed50b285cccebfbff1d17d01a670e2 not found: ID does not exist" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.955314 4892 scope.go:117] "RemoveContainer" containerID="907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2" Feb 17 18:43:49 crc kubenswrapper[4892]: E0217 18:43:49.956073 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2\": container with ID starting with 907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2 not found: ID 
does not exist" containerID="907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2" Feb 17 18:43:49 crc kubenswrapper[4892]: I0217 18:43:49.956107 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2"} err="failed to get container status \"907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2\": rpc error: code = NotFound desc = could not find container \"907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2\": container with ID starting with 907563a8be18d7c478fdbde341b33aa5ffb748c6fa0f28bae4b1ec3a21299ca2 not found: ID does not exist" Feb 17 18:43:50 crc kubenswrapper[4892]: I0217 18:43:50.715018 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74tgn"] Feb 17 18:43:50 crc kubenswrapper[4892]: I0217 18:43:50.879911 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74tgn" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="registry-server" containerID="cri-o://e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e" gracePeriod=2 Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.376028 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" path="/var/lib/kubelet/pods/38f571e6-3ddb-46f2-be46-1303d07ab52f/volumes" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.768572 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.895027 4892 generic.go:334] "Generic (PLEG): container finished" podID="41a7480f-7795-4134-b7fd-debd314f8bba" containerID="e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e" exitCode=0 Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.895108 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74tgn" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.895156 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerDied","Data":"e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e"} Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.895192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74tgn" event={"ID":"41a7480f-7795-4134-b7fd-debd314f8bba","Type":"ContainerDied","Data":"16d40205832380e912eb65fc1ce6e1ec11804554b0736cdc449ff3fb0b9c46df"} Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.895218 4892 scope.go:117] "RemoveContainer" containerID="e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.901978 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nczg\" (UniqueName: \"kubernetes.io/projected/41a7480f-7795-4134-b7fd-debd314f8bba-kube-api-access-5nczg\") pod \"41a7480f-7795-4134-b7fd-debd314f8bba\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.902254 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-catalog-content\") pod 
\"41a7480f-7795-4134-b7fd-debd314f8bba\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.902392 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-utilities\") pod \"41a7480f-7795-4134-b7fd-debd314f8bba\" (UID: \"41a7480f-7795-4134-b7fd-debd314f8bba\") " Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.904117 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-utilities" (OuterVolumeSpecName: "utilities") pod "41a7480f-7795-4134-b7fd-debd314f8bba" (UID: "41a7480f-7795-4134-b7fd-debd314f8bba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.913179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a7480f-7795-4134-b7fd-debd314f8bba-kube-api-access-5nczg" (OuterVolumeSpecName: "kube-api-access-5nczg") pod "41a7480f-7795-4134-b7fd-debd314f8bba" (UID: "41a7480f-7795-4134-b7fd-debd314f8bba"). InnerVolumeSpecName "kube-api-access-5nczg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.927441 4892 scope.go:117] "RemoveContainer" containerID="b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.954642 4892 scope.go:117] "RemoveContainer" containerID="12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.968603 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41a7480f-7795-4134-b7fd-debd314f8bba" (UID: "41a7480f-7795-4134-b7fd-debd314f8bba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.983018 4892 scope.go:117] "RemoveContainer" containerID="e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e" Feb 17 18:43:51 crc kubenswrapper[4892]: E0217 18:43:51.983413 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e\": container with ID starting with e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e not found: ID does not exist" containerID="e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.983449 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e"} err="failed to get container status \"e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e\": rpc error: code = NotFound desc = could not find container \"e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e\": container with ID starting 
with e3d353aba5fba1d90d21f24f9222add9fcc93ce8d4a6a447a9a1939eb5ec780e not found: ID does not exist" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.983476 4892 scope.go:117] "RemoveContainer" containerID="b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20" Feb 17 18:43:51 crc kubenswrapper[4892]: E0217 18:43:51.983913 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20\": container with ID starting with b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20 not found: ID does not exist" containerID="b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.983949 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20"} err="failed to get container status \"b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20\": rpc error: code = NotFound desc = could not find container \"b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20\": container with ID starting with b657e71d43ac6018b114ab9aad52fbc8badd38779ec8b3fe3395ed4859fd7e20 not found: ID does not exist" Feb 17 18:43:51 crc kubenswrapper[4892]: I0217 18:43:51.983971 4892 scope.go:117] "RemoveContainer" containerID="12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c" Feb 17 18:43:51 crc kubenswrapper[4892]: E0217 18:43:51.984362 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c\": container with ID starting with 12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c not found: ID does not exist" containerID="12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c" Feb 17 18:43:51 
crc kubenswrapper[4892]: I0217 18:43:51.984383 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c"} err="failed to get container status \"12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c\": rpc error: code = NotFound desc = could not find container \"12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c\": container with ID starting with 12f4f674455de613b0808c5dd504027fb9871a2b6af05bc3cd72d9a67e742c5c not found: ID does not exist" Feb 17 18:43:52 crc kubenswrapper[4892]: I0217 18:43:52.004316 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:52 crc kubenswrapper[4892]: I0217 18:43:52.004350 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nczg\" (UniqueName: \"kubernetes.io/projected/41a7480f-7795-4134-b7fd-debd314f8bba-kube-api-access-5nczg\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:52 crc kubenswrapper[4892]: I0217 18:43:52.004360 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a7480f-7795-4134-b7fd-debd314f8bba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:52 crc kubenswrapper[4892]: I0217 18:43:52.230154 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74tgn"] Feb 17 18:43:52 crc kubenswrapper[4892]: I0217 18:43:52.238437 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74tgn"] Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.114154 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg6p8"] Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.114364 4892 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gg6p8" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="registry-server" containerID="cri-o://f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2" gracePeriod=2 Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.368933 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" path="/var/lib/kubelet/pods/41a7480f-7795-4134-b7fd-debd314f8bba/volumes" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.482938 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.631051 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-utilities\") pod \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.631186 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-catalog-content\") pod \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.631256 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9lwk\" (UniqueName: \"kubernetes.io/projected/05a339fa-bca7-4f9c-be57-f4364d7b83ea-kube-api-access-b9lwk\") pod \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\" (UID: \"05a339fa-bca7-4f9c-be57-f4364d7b83ea\") " Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.633039 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-utilities" (OuterVolumeSpecName: "utilities") pod "05a339fa-bca7-4f9c-be57-f4364d7b83ea" (UID: "05a339fa-bca7-4f9c-be57-f4364d7b83ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.638938 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a339fa-bca7-4f9c-be57-f4364d7b83ea-kube-api-access-b9lwk" (OuterVolumeSpecName: "kube-api-access-b9lwk") pod "05a339fa-bca7-4f9c-be57-f4364d7b83ea" (UID: "05a339fa-bca7-4f9c-be57-f4364d7b83ea"). InnerVolumeSpecName "kube-api-access-b9lwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.671568 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05a339fa-bca7-4f9c-be57-f4364d7b83ea" (UID: "05a339fa-bca7-4f9c-be57-f4364d7b83ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.733851 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.733901 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9lwk\" (UniqueName: \"kubernetes.io/projected/05a339fa-bca7-4f9c-be57-f4364d7b83ea-kube-api-access-b9lwk\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.733920 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a339fa-bca7-4f9c-be57-f4364d7b83ea-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.926827 4892 generic.go:334] "Generic (PLEG): container finished" podID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerID="f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2" exitCode=0 Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.927041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerDied","Data":"f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2"} Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.927358 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg6p8" event={"ID":"05a339fa-bca7-4f9c-be57-f4364d7b83ea","Type":"ContainerDied","Data":"13ed48fbdbb3c5fcf53443f233e3d7a715410298ffe2ef60234550134654b20f"} Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.927384 4892 scope.go:117] "RemoveContainer" containerID="f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 
18:43:53.927506 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg6p8" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.954520 4892 scope.go:117] "RemoveContainer" containerID="8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa" Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.965669 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg6p8"] Feb 17 18:43:53 crc kubenswrapper[4892]: I0217 18:43:53.972112 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg6p8"] Feb 17 18:43:54 crc kubenswrapper[4892]: I0217 18:43:54.003684 4892 scope.go:117] "RemoveContainer" containerID="b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0" Feb 17 18:43:54 crc kubenswrapper[4892]: I0217 18:43:54.019271 4892 scope.go:117] "RemoveContainer" containerID="f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2" Feb 17 18:43:54 crc kubenswrapper[4892]: E0217 18:43:54.019552 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2\": container with ID starting with f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2 not found: ID does not exist" containerID="f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2" Feb 17 18:43:54 crc kubenswrapper[4892]: I0217 18:43:54.019581 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2"} err="failed to get container status \"f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2\": rpc error: code = NotFound desc = could not find container \"f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2\": container with ID starting with 
f3727b604559b95126e734078f4096bd88b941955102fbad9bddf69264d7ccf2 not found: ID does not exist" Feb 17 18:43:54 crc kubenswrapper[4892]: I0217 18:43:54.019600 4892 scope.go:117] "RemoveContainer" containerID="8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa" Feb 17 18:43:54 crc kubenswrapper[4892]: E0217 18:43:54.019970 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa\": container with ID starting with 8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa not found: ID does not exist" containerID="8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa" Feb 17 18:43:54 crc kubenswrapper[4892]: I0217 18:43:54.020007 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa"} err="failed to get container status \"8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa\": rpc error: code = NotFound desc = could not find container \"8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa\": container with ID starting with 8c956a1a74b50ccd525c4cff86a82c35ab86e1c1a83f760894ba7f273fe3c6fa not found: ID does not exist" Feb 17 18:43:54 crc kubenswrapper[4892]: I0217 18:43:54.020034 4892 scope.go:117] "RemoveContainer" containerID="b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0" Feb 17 18:43:54 crc kubenswrapper[4892]: E0217 18:43:54.020333 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0\": container with ID starting with b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0 not found: ID does not exist" containerID="b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0" Feb 17 18:43:54 crc 
kubenswrapper[4892]: I0217 18:43:54.020347 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0"} err="failed to get container status \"b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0\": rpc error: code = NotFound desc = could not find container \"b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0\": container with ID starting with b55710cd5e2ecc4dbc42fa7f15b3dd34a2b313cc79a9209b3c8cf5ffa57d18c0 not found: ID does not exist" Feb 17 18:43:55 crc kubenswrapper[4892]: I0217 18:43:55.373553 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" path="/var/lib/kubelet/pods/05a339fa-bca7-4f9c-be57-f4364d7b83ea/volumes" Feb 17 18:44:07 crc kubenswrapper[4892]: I0217 18:44:07.424900 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:44:07 crc kubenswrapper[4892]: I0217 18:44:07.425552 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:44:37 crc kubenswrapper[4892]: I0217 18:44:37.425048 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:44:37 crc kubenswrapper[4892]: I0217 18:44:37.425615 4892 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:44:37 crc kubenswrapper[4892]: I0217 18:44:37.425669 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:44:37 crc kubenswrapper[4892]: I0217 18:44:37.426555 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:44:37 crc kubenswrapper[4892]: I0217 18:44:37.426631 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" gracePeriod=600 Feb 17 18:44:37 crc kubenswrapper[4892]: E0217 18:44:37.551485 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:44:38 crc kubenswrapper[4892]: I0217 18:44:38.383962 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" exitCode=0 Feb 17 18:44:38 crc kubenswrapper[4892]: I0217 18:44:38.384051 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81"} Feb 17 18:44:38 crc kubenswrapper[4892]: I0217 18:44:38.384135 4892 scope.go:117] "RemoveContainer" containerID="fe86fc28385d9eb4781c2be11d00be8ffe1907a854f7e29ff267ed39565d85ba" Feb 17 18:44:38 crc kubenswrapper[4892]: I0217 18:44:38.385289 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:44:38 crc kubenswrapper[4892]: E0217 18:44:38.385752 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:44:52 crc kubenswrapper[4892]: I0217 18:44:52.360115 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:44:52 crc kubenswrapper[4892]: E0217 18:44:52.360869 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 
18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.190004 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f"] Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.190922 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="extract-content" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.190941 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="extract-content" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.190966 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="extract-utilities" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.190974 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="extract-utilities" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.190991 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="extract-content" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.190999 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="extract-content" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.191011 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191017 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.191030 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="extract-utilities" Feb 17 18:45:00 
crc kubenswrapper[4892]: I0217 18:45:00.191035 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="extract-utilities" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.191051 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="extract-content" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191056 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="extract-content" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.191072 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191078 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.191090 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191096 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: E0217 18:45:00.191107 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="extract-utilities" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191113 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="extract-utilities" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191263 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f571e6-3ddb-46f2-be46-1303d07ab52f" containerName="registry-server" Feb 17 18:45:00 crc 
kubenswrapper[4892]: I0217 18:45:00.191283 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a7480f-7795-4134-b7fd-debd314f8bba" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191305 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a339fa-bca7-4f9c-be57-f4364d7b83ea" containerName="registry-server" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.191867 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.194832 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.195057 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.209408 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f"] Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.295448 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a216cfbe-7448-4d20-a9df-bf50992681b9-secret-volume\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.295618 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a216cfbe-7448-4d20-a9df-bf50992681b9-config-volume\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.295665 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqc7t\" (UniqueName: \"kubernetes.io/projected/a216cfbe-7448-4d20-a9df-bf50992681b9-kube-api-access-jqc7t\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.397428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a216cfbe-7448-4d20-a9df-bf50992681b9-secret-volume\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.397605 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a216cfbe-7448-4d20-a9df-bf50992681b9-config-volume\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.397650 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqc7t\" (UniqueName: \"kubernetes.io/projected/a216cfbe-7448-4d20-a9df-bf50992681b9-kube-api-access-jqc7t\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.399111 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a216cfbe-7448-4d20-a9df-bf50992681b9-config-volume\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.405272 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a216cfbe-7448-4d20-a9df-bf50992681b9-secret-volume\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.428223 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqc7t\" (UniqueName: \"kubernetes.io/projected/a216cfbe-7448-4d20-a9df-bf50992681b9-kube-api-access-jqc7t\") pod \"collect-profiles-29522565-llf4f\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:00 crc kubenswrapper[4892]: I0217 18:45:00.509560 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:01 crc kubenswrapper[4892]: I0217 18:45:01.073457 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f"] Feb 17 18:45:01 crc kubenswrapper[4892]: I0217 18:45:01.612671 4892 generic.go:334] "Generic (PLEG): container finished" podID="a216cfbe-7448-4d20-a9df-bf50992681b9" containerID="d1caf24d56c25f58a84ca8fb005d8b73afc8b43aea5f2fadec2cee809f46f9ba" exitCode=0 Feb 17 18:45:01 crc kubenswrapper[4892]: I0217 18:45:01.612732 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" event={"ID":"a216cfbe-7448-4d20-a9df-bf50992681b9","Type":"ContainerDied","Data":"d1caf24d56c25f58a84ca8fb005d8b73afc8b43aea5f2fadec2cee809f46f9ba"} Feb 17 18:45:01 crc kubenswrapper[4892]: I0217 18:45:01.612773 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" event={"ID":"a216cfbe-7448-4d20-a9df-bf50992681b9","Type":"ContainerStarted","Data":"d0002480e1e515054cdf5fcf05fa96bba4fe4181f05c22ffcb737ef637b33a67"} Feb 17 18:45:02 crc kubenswrapper[4892]: I0217 18:45:02.954421 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.062193 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a216cfbe-7448-4d20-a9df-bf50992681b9-config-volume\") pod \"a216cfbe-7448-4d20-a9df-bf50992681b9\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.062240 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqc7t\" (UniqueName: \"kubernetes.io/projected/a216cfbe-7448-4d20-a9df-bf50992681b9-kube-api-access-jqc7t\") pod \"a216cfbe-7448-4d20-a9df-bf50992681b9\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.062358 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a216cfbe-7448-4d20-a9df-bf50992681b9-secret-volume\") pod \"a216cfbe-7448-4d20-a9df-bf50992681b9\" (UID: \"a216cfbe-7448-4d20-a9df-bf50992681b9\") " Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.063547 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a216cfbe-7448-4d20-a9df-bf50992681b9-config-volume" (OuterVolumeSpecName: "config-volume") pod "a216cfbe-7448-4d20-a9df-bf50992681b9" (UID: "a216cfbe-7448-4d20-a9df-bf50992681b9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.068695 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a216cfbe-7448-4d20-a9df-bf50992681b9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a216cfbe-7448-4d20-a9df-bf50992681b9" (UID: "a216cfbe-7448-4d20-a9df-bf50992681b9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.071046 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a216cfbe-7448-4d20-a9df-bf50992681b9-kube-api-access-jqc7t" (OuterVolumeSpecName: "kube-api-access-jqc7t") pod "a216cfbe-7448-4d20-a9df-bf50992681b9" (UID: "a216cfbe-7448-4d20-a9df-bf50992681b9"). InnerVolumeSpecName "kube-api-access-jqc7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.164965 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a216cfbe-7448-4d20-a9df-bf50992681b9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.165315 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqc7t\" (UniqueName: \"kubernetes.io/projected/a216cfbe-7448-4d20-a9df-bf50992681b9-kube-api-access-jqc7t\") on node \"crc\" DevicePath \"\"" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.165334 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a216cfbe-7448-4d20-a9df-bf50992681b9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.634671 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" event={"ID":"a216cfbe-7448-4d20-a9df-bf50992681b9","Type":"ContainerDied","Data":"d0002480e1e515054cdf5fcf05fa96bba4fe4181f05c22ffcb737ef637b33a67"} Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.634705 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0002480e1e515054cdf5fcf05fa96bba4fe4181f05c22ffcb737ef637b33a67" Feb 17 18:45:03 crc kubenswrapper[4892]: I0217 18:45:03.634741 4892 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f" Feb 17 18:45:04 crc kubenswrapper[4892]: I0217 18:45:04.054903 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52"] Feb 17 18:45:04 crc kubenswrapper[4892]: I0217 18:45:04.064203 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522520-5jt52"] Feb 17 18:45:05 crc kubenswrapper[4892]: I0217 18:45:05.375953 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8477f7-43ee-4fe8-8ff6-1af065f5ab21" path="/var/lib/kubelet/pods/8a8477f7-43ee-4fe8-8ff6-1af065f5ab21/volumes" Feb 17 18:45:06 crc kubenswrapper[4892]: I0217 18:45:06.360029 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:45:06 crc kubenswrapper[4892]: E0217 18:45:06.360606 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:45:20 crc kubenswrapper[4892]: I0217 18:45:20.359623 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:45:20 crc kubenswrapper[4892]: E0217 18:45:20.360657 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:45:35 crc kubenswrapper[4892]: I0217 18:45:35.359745 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:45:35 crc kubenswrapper[4892]: E0217 18:45:35.360714 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:45:46 crc kubenswrapper[4892]: I0217 18:45:46.359678 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:45:46 crc kubenswrapper[4892]: E0217 18:45:46.360701 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:45:57 crc kubenswrapper[4892]: I0217 18:45:57.361435 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:45:57 crc kubenswrapper[4892]: E0217 18:45:57.362884 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:46:00 crc kubenswrapper[4892]: I0217 18:46:00.298884 4892 scope.go:117] "RemoveContainer" containerID="f7283cef0e509d28cda8cd3f7e1d8875cd21279f2449d37c8cdedc0c5d655358" Feb 17 18:46:08 crc kubenswrapper[4892]: I0217 18:46:08.360713 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:46:08 crc kubenswrapper[4892]: E0217 18:46:08.361518 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:46:20 crc kubenswrapper[4892]: I0217 18:46:20.359526 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:46:20 crc kubenswrapper[4892]: E0217 18:46:20.360440 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:46:31 crc kubenswrapper[4892]: I0217 18:46:31.360442 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:46:31 crc kubenswrapper[4892]: E0217 18:46:31.361473 4892 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:46:43 crc kubenswrapper[4892]: I0217 18:46:43.359191 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:46:43 crc kubenswrapper[4892]: E0217 18:46:43.359986 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:46:58 crc kubenswrapper[4892]: I0217 18:46:58.360002 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:46:58 crc kubenswrapper[4892]: E0217 18:46:58.361906 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:47:13 crc kubenswrapper[4892]: I0217 18:47:13.360005 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:47:13 crc kubenswrapper[4892]: E0217 
18:47:13.360974 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:47:27 crc kubenswrapper[4892]: I0217 18:47:27.360592 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:47:27 crc kubenswrapper[4892]: E0217 18:47:27.361887 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:47:38 crc kubenswrapper[4892]: I0217 18:47:38.359757 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:47:38 crc kubenswrapper[4892]: E0217 18:47:38.361048 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:47:52 crc kubenswrapper[4892]: I0217 18:47:52.359785 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:47:52 crc 
kubenswrapper[4892]: E0217 18:47:52.360634 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:48:07 crc kubenswrapper[4892]: I0217 18:48:07.360118 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:48:07 crc kubenswrapper[4892]: E0217 18:48:07.361088 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:48:20 crc kubenswrapper[4892]: I0217 18:48:20.359759 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:48:20 crc kubenswrapper[4892]: E0217 18:48:20.360582 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:48:32 crc kubenswrapper[4892]: I0217 18:48:32.360588 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 
17 18:48:32 crc kubenswrapper[4892]: E0217 18:48:32.361426 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:48:46 crc kubenswrapper[4892]: I0217 18:48:46.361081 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:48:46 crc kubenswrapper[4892]: E0217 18:48:46.361940 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:49:00 crc kubenswrapper[4892]: I0217 18:49:00.359326 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:49:00 crc kubenswrapper[4892]: E0217 18:49:00.360449 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:49:15 crc kubenswrapper[4892]: I0217 18:49:15.359955 4892 scope.go:117] "RemoveContainer" 
containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:49:15 crc kubenswrapper[4892]: E0217 18:49:15.360958 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:49:30 crc kubenswrapper[4892]: I0217 18:49:30.360404 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:49:30 crc kubenswrapper[4892]: E0217 18:49:30.361717 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:49:43 crc kubenswrapper[4892]: I0217 18:49:43.363612 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:49:44 crc kubenswrapper[4892]: I0217 18:49:44.413896 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"a6b98ccb6f64f3fd845ca76a31c615df1295c0d44951805096513fab72501508"} Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.222036 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vh4bc"] Feb 17 18:51:50 crc kubenswrapper[4892]: E0217 18:51:50.223097 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a216cfbe-7448-4d20-a9df-bf50992681b9" containerName="collect-profiles" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.223114 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a216cfbe-7448-4d20-a9df-bf50992681b9" containerName="collect-profiles" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.223382 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a216cfbe-7448-4d20-a9df-bf50992681b9" containerName="collect-profiles" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.225299 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.242798 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh4bc"] Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.351342 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-utilities\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.351438 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-catalog-content\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.351480 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn5jw\" (UniqueName: \"kubernetes.io/projected/38636e8c-6332-4b56-85c4-df6c6ae7f079-kube-api-access-sn5jw\") pod 
\"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.453112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-utilities\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.453174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-catalog-content\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.453194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn5jw\" (UniqueName: \"kubernetes.io/projected/38636e8c-6332-4b56-85c4-df6c6ae7f079-kube-api-access-sn5jw\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.453590 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-utilities\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.453832 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-catalog-content\") pod \"redhat-operators-vh4bc\" (UID: 
\"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.568221 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn5jw\" (UniqueName: \"kubernetes.io/projected/38636e8c-6332-4b56-85c4-df6c6ae7f079-kube-api-access-sn5jw\") pod \"redhat-operators-vh4bc\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:50 crc kubenswrapper[4892]: I0217 18:51:50.580305 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:51:51 crc kubenswrapper[4892]: I0217 18:51:51.065750 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh4bc"] Feb 17 18:51:51 crc kubenswrapper[4892]: I0217 18:51:51.727260 4892 generic.go:334] "Generic (PLEG): container finished" podID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerID="67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480" exitCode=0 Feb 17 18:51:51 crc kubenswrapper[4892]: I0217 18:51:51.727349 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerDied","Data":"67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480"} Feb 17 18:51:51 crc kubenswrapper[4892]: I0217 18:51:51.727704 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerStarted","Data":"7ef9f07791f4dae51c3e2a2bf5ea1e2d8e3a43a3e655e3b1ef72ea8252a01d86"} Feb 17 18:51:51 crc kubenswrapper[4892]: I0217 18:51:51.729782 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 18:51:52 crc kubenswrapper[4892]: I0217 18:51:52.742313 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerStarted","Data":"310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780"} Feb 17 18:51:53 crc kubenswrapper[4892]: I0217 18:51:53.758405 4892 generic.go:334] "Generic (PLEG): container finished" podID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerID="310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780" exitCode=0 Feb 17 18:51:53 crc kubenswrapper[4892]: I0217 18:51:53.758469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerDied","Data":"310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780"} Feb 17 18:51:54 crc kubenswrapper[4892]: I0217 18:51:54.776011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerStarted","Data":"cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4"} Feb 17 18:51:54 crc kubenswrapper[4892]: I0217 18:51:54.812697 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vh4bc" podStartSLOduration=2.309245455 podStartE2EDuration="4.812671143s" podCreationTimestamp="2026-02-17 18:51:50 +0000 UTC" firstStartedPulling="2026-02-17 18:51:51.729507124 +0000 UTC m=+4083.104910389" lastFinishedPulling="2026-02-17 18:51:54.232932772 +0000 UTC m=+4085.608336077" observedRunningTime="2026-02-17 18:51:54.794105892 +0000 UTC m=+4086.169509247" watchObservedRunningTime="2026-02-17 18:51:54.812671143 +0000 UTC m=+4086.188074448" Feb 17 18:52:00 crc kubenswrapper[4892]: I0217 18:52:00.581064 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:52:00 crc kubenswrapper[4892]: I0217 
18:52:00.581698 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:52:01 crc kubenswrapper[4892]: I0217 18:52:01.638147 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vh4bc" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="registry-server" probeResult="failure" output=< Feb 17 18:52:01 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 18:52:01 crc kubenswrapper[4892]: > Feb 17 18:52:07 crc kubenswrapper[4892]: I0217 18:52:07.424696 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:52:07 crc kubenswrapper[4892]: I0217 18:52:07.425165 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:52:10 crc kubenswrapper[4892]: I0217 18:52:10.636600 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:52:10 crc kubenswrapper[4892]: I0217 18:52:10.690313 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:52:10 crc kubenswrapper[4892]: I0217 18:52:10.875700 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh4bc"] Feb 17 18:52:11 crc kubenswrapper[4892]: I0217 18:52:11.932636 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-vh4bc" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="registry-server" containerID="cri-o://cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4" gracePeriod=2 Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.444364 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.569687 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-utilities\") pod \"38636e8c-6332-4b56-85c4-df6c6ae7f079\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.570204 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn5jw\" (UniqueName: \"kubernetes.io/projected/38636e8c-6332-4b56-85c4-df6c6ae7f079-kube-api-access-sn5jw\") pod \"38636e8c-6332-4b56-85c4-df6c6ae7f079\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.570304 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-catalog-content\") pod \"38636e8c-6332-4b56-85c4-df6c6ae7f079\" (UID: \"38636e8c-6332-4b56-85c4-df6c6ae7f079\") " Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.571257 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-utilities" (OuterVolumeSpecName: "utilities") pod "38636e8c-6332-4b56-85c4-df6c6ae7f079" (UID: "38636e8c-6332-4b56-85c4-df6c6ae7f079"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.577499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38636e8c-6332-4b56-85c4-df6c6ae7f079-kube-api-access-sn5jw" (OuterVolumeSpecName: "kube-api-access-sn5jw") pod "38636e8c-6332-4b56-85c4-df6c6ae7f079" (UID: "38636e8c-6332-4b56-85c4-df6c6ae7f079"). InnerVolumeSpecName "kube-api-access-sn5jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.673257 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.673339 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn5jw\" (UniqueName: \"kubernetes.io/projected/38636e8c-6332-4b56-85c4-df6c6ae7f079-kube-api-access-sn5jw\") on node \"crc\" DevicePath \"\"" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.724624 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38636e8c-6332-4b56-85c4-df6c6ae7f079" (UID: "38636e8c-6332-4b56-85c4-df6c6ae7f079"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.775131 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38636e8c-6332-4b56-85c4-df6c6ae7f079-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.942394 4892 generic.go:334] "Generic (PLEG): container finished" podID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerID="cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4" exitCode=0 Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.942461 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh4bc" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.942462 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerDied","Data":"cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4"} Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.942568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh4bc" event={"ID":"38636e8c-6332-4b56-85c4-df6c6ae7f079","Type":"ContainerDied","Data":"7ef9f07791f4dae51c3e2a2bf5ea1e2d8e3a43a3e655e3b1ef72ea8252a01d86"} Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.942589 4892 scope.go:117] "RemoveContainer" containerID="cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.978186 4892 scope.go:117] "RemoveContainer" containerID="310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780" Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 18:52:12.992048 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh4bc"] Feb 17 18:52:12 crc kubenswrapper[4892]: I0217 
18:52:12.997091 4892 scope.go:117] "RemoveContainer" containerID="67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.011529 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vh4bc"] Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.039438 4892 scope.go:117] "RemoveContainer" containerID="cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4" Feb 17 18:52:13 crc kubenswrapper[4892]: E0217 18:52:13.039941 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4\": container with ID starting with cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4 not found: ID does not exist" containerID="cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.039976 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4"} err="failed to get container status \"cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4\": rpc error: code = NotFound desc = could not find container \"cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4\": container with ID starting with cb756a529356982fb60557a1f56fead5086b2eb5723094c285495b02c54516d4 not found: ID does not exist" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.040005 4892 scope.go:117] "RemoveContainer" containerID="310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780" Feb 17 18:52:13 crc kubenswrapper[4892]: E0217 18:52:13.040410 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780\": container with ID 
starting with 310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780 not found: ID does not exist" containerID="310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.040471 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780"} err="failed to get container status \"310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780\": rpc error: code = NotFound desc = could not find container \"310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780\": container with ID starting with 310ac8f93361c93b857146e0d35a3f8996a90e3601ad370a95153a6441c2a780 not found: ID does not exist" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.040503 4892 scope.go:117] "RemoveContainer" containerID="67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480" Feb 17 18:52:13 crc kubenswrapper[4892]: E0217 18:52:13.040988 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480\": container with ID starting with 67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480 not found: ID does not exist" containerID="67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.041015 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480"} err="failed to get container status \"67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480\": rpc error: code = NotFound desc = could not find container \"67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480\": container with ID starting with 67ac47e5316a9d1584ee69d16c0419f7c13f75cc51c170bd061029dbaff58480 not found: 
ID does not exist" Feb 17 18:52:13 crc kubenswrapper[4892]: I0217 18:52:13.376567 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" path="/var/lib/kubelet/pods/38636e8c-6332-4b56-85c4-df6c6ae7f079/volumes" Feb 17 18:52:37 crc kubenswrapper[4892]: I0217 18:52:37.425639 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:52:37 crc kubenswrapper[4892]: I0217 18:52:37.426568 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:53:07 crc kubenswrapper[4892]: I0217 18:53:07.424616 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:53:07 crc kubenswrapper[4892]: I0217 18:53:07.425263 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:53:07 crc kubenswrapper[4892]: I0217 18:53:07.425316 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:53:07 crc 
kubenswrapper[4892]: I0217 18:53:07.426049 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6b98ccb6f64f3fd845ca76a31c615df1295c0d44951805096513fab72501508"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:53:07 crc kubenswrapper[4892]: I0217 18:53:07.426126 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://a6b98ccb6f64f3fd845ca76a31c615df1295c0d44951805096513fab72501508" gracePeriod=600 Feb 17 18:53:08 crc kubenswrapper[4892]: I0217 18:53:08.510609 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="a6b98ccb6f64f3fd845ca76a31c615df1295c0d44951805096513fab72501508" exitCode=0 Feb 17 18:53:08 crc kubenswrapper[4892]: I0217 18:53:08.510724 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"a6b98ccb6f64f3fd845ca76a31c615df1295c0d44951805096513fab72501508"} Feb 17 18:53:08 crc kubenswrapper[4892]: I0217 18:53:08.511483 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"} Feb 17 18:53:08 crc kubenswrapper[4892]: I0217 18:53:08.511532 4892 scope.go:117] "RemoveContainer" containerID="1c808ea8689d49c6406b07c975a960fba73e2a381719a39584da9285b8457a81" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.643070 4892 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2zn6j"] Feb 17 18:53:43 crc kubenswrapper[4892]: E0217 18:53:43.644182 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="extract-utilities" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.644206 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="extract-utilities" Feb 17 18:53:43 crc kubenswrapper[4892]: E0217 18:53:43.644228 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="registry-server" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.644239 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="registry-server" Feb 17 18:53:43 crc kubenswrapper[4892]: E0217 18:53:43.644293 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="extract-content" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.644303 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="extract-content" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.644584 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="38636e8c-6332-4b56-85c4-df6c6ae7f079" containerName="registry-server" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.646504 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.672775 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zn6j"] Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.771006 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-catalog-content\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.771094 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-utilities\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.771170 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjbrt\" (UniqueName: \"kubernetes.io/projected/6013816c-d9bf-48a2-9364-30109d8b0084-kube-api-access-jjbrt\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.872284 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjbrt\" (UniqueName: \"kubernetes.io/projected/6013816c-d9bf-48a2-9364-30109d8b0084-kube-api-access-jjbrt\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.872398 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-catalog-content\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.872456 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-utilities\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.873108 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-catalog-content\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.873141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-utilities\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.903034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjbrt\" (UniqueName: \"kubernetes.io/projected/6013816c-d9bf-48a2-9364-30109d8b0084-kube-api-access-jjbrt\") pod \"redhat-marketplace-2zn6j\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:43 crc kubenswrapper[4892]: I0217 18:53:43.972631 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:44 crc kubenswrapper[4892]: I0217 18:53:44.404220 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zn6j"] Feb 17 18:53:44 crc kubenswrapper[4892]: I0217 18:53:44.900333 4892 generic.go:334] "Generic (PLEG): container finished" podID="6013816c-d9bf-48a2-9364-30109d8b0084" containerID="b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06" exitCode=0 Feb 17 18:53:44 crc kubenswrapper[4892]: I0217 18:53:44.900432 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zn6j" event={"ID":"6013816c-d9bf-48a2-9364-30109d8b0084","Type":"ContainerDied","Data":"b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06"} Feb 17 18:53:44 crc kubenswrapper[4892]: I0217 18:53:44.900647 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zn6j" event={"ID":"6013816c-d9bf-48a2-9364-30109d8b0084","Type":"ContainerStarted","Data":"0bf0a37e221e2da8288836e86aace02e2f7865fff05339b6626ad90d86b79b97"} Feb 17 18:53:45 crc kubenswrapper[4892]: I0217 18:53:45.917031 4892 generic.go:334] "Generic (PLEG): container finished" podID="6013816c-d9bf-48a2-9364-30109d8b0084" containerID="cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b" exitCode=0 Feb 17 18:53:45 crc kubenswrapper[4892]: I0217 18:53:45.917155 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zn6j" event={"ID":"6013816c-d9bf-48a2-9364-30109d8b0084","Type":"ContainerDied","Data":"cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b"} Feb 17 18:53:46 crc kubenswrapper[4892]: I0217 18:53:46.929426 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zn6j" 
event={"ID":"6013816c-d9bf-48a2-9364-30109d8b0084","Type":"ContainerStarted","Data":"2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f"} Feb 17 18:53:46 crc kubenswrapper[4892]: I0217 18:53:46.955912 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2zn6j" podStartSLOduration=2.433428176 podStartE2EDuration="3.955889985s" podCreationTimestamp="2026-02-17 18:53:43 +0000 UTC" firstStartedPulling="2026-02-17 18:53:44.902376998 +0000 UTC m=+4196.277780263" lastFinishedPulling="2026-02-17 18:53:46.424838797 +0000 UTC m=+4197.800242072" observedRunningTime="2026-02-17 18:53:46.954399385 +0000 UTC m=+4198.329802660" watchObservedRunningTime="2026-02-17 18:53:46.955889985 +0000 UTC m=+4198.331293260" Feb 17 18:53:53 crc kubenswrapper[4892]: I0217 18:53:53.974535 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:53 crc kubenswrapper[4892]: I0217 18:53:53.975085 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:54 crc kubenswrapper[4892]: I0217 18:53:54.014913 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:54 crc kubenswrapper[4892]: I0217 18:53:54.057752 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:54 crc kubenswrapper[4892]: I0217 18:53:54.262102 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zn6j"] Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.013043 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2zn6j" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="registry-server" 
containerID="cri-o://2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f" gracePeriod=2 Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.447985 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.611029 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjbrt\" (UniqueName: \"kubernetes.io/projected/6013816c-d9bf-48a2-9364-30109d8b0084-kube-api-access-jjbrt\") pod \"6013816c-d9bf-48a2-9364-30109d8b0084\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.611124 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-catalog-content\") pod \"6013816c-d9bf-48a2-9364-30109d8b0084\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.611275 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-utilities\") pod \"6013816c-d9bf-48a2-9364-30109d8b0084\" (UID: \"6013816c-d9bf-48a2-9364-30109d8b0084\") " Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.614415 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-utilities" (OuterVolumeSpecName: "utilities") pod "6013816c-d9bf-48a2-9364-30109d8b0084" (UID: "6013816c-d9bf-48a2-9364-30109d8b0084"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.618967 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6013816c-d9bf-48a2-9364-30109d8b0084-kube-api-access-jjbrt" (OuterVolumeSpecName: "kube-api-access-jjbrt") pod "6013816c-d9bf-48a2-9364-30109d8b0084" (UID: "6013816c-d9bf-48a2-9364-30109d8b0084"). InnerVolumeSpecName "kube-api-access-jjbrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.639629 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6013816c-d9bf-48a2-9364-30109d8b0084" (UID: "6013816c-d9bf-48a2-9364-30109d8b0084"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.713545 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjbrt\" (UniqueName: \"kubernetes.io/projected/6013816c-d9bf-48a2-9364-30109d8b0084-kube-api-access-jjbrt\") on node \"crc\" DevicePath \"\"" Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.713589 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:53:56 crc kubenswrapper[4892]: I0217 18:53:56.713603 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6013816c-d9bf-48a2-9364-30109d8b0084-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.027880 4892 generic.go:334] "Generic (PLEG): container finished" podID="6013816c-d9bf-48a2-9364-30109d8b0084" 
containerID="2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f" exitCode=0 Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.027947 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zn6j" event={"ID":"6013816c-d9bf-48a2-9364-30109d8b0084","Type":"ContainerDied","Data":"2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f"} Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.028004 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zn6j" event={"ID":"6013816c-d9bf-48a2-9364-30109d8b0084","Type":"ContainerDied","Data":"0bf0a37e221e2da8288836e86aace02e2f7865fff05339b6626ad90d86b79b97"} Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.028032 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zn6j" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.028042 4892 scope.go:117] "RemoveContainer" containerID="2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.062726 4892 scope.go:117] "RemoveContainer" containerID="cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.095114 4892 scope.go:117] "RemoveContainer" containerID="b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.109865 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zn6j"] Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.123356 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zn6j"] Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.137400 4892 scope.go:117] "RemoveContainer" containerID="2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f" Feb 17 
18:53:57 crc kubenswrapper[4892]: E0217 18:53:57.138279 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f\": container with ID starting with 2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f not found: ID does not exist" containerID="2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.138345 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f"} err="failed to get container status \"2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f\": rpc error: code = NotFound desc = could not find container \"2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f\": container with ID starting with 2c80eaf7c73bc0382265dee0a9e0b60a4f8f0e44ca2118cb547be476ec21331f not found: ID does not exist" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.138413 4892 scope.go:117] "RemoveContainer" containerID="cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b" Feb 17 18:53:57 crc kubenswrapper[4892]: E0217 18:53:57.138971 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b\": container with ID starting with cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b not found: ID does not exist" containerID="cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.139009 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b"} err="failed to get container status 
\"cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b\": rpc error: code = NotFound desc = could not find container \"cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b\": container with ID starting with cb1d106914db4510eb4f80f7155f8e7e24a7db0a303f666e014148ba5d0b315b not found: ID does not exist" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.139034 4892 scope.go:117] "RemoveContainer" containerID="b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06" Feb 17 18:53:57 crc kubenswrapper[4892]: E0217 18:53:57.139489 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06\": container with ID starting with b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06 not found: ID does not exist" containerID="b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.139627 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06"} err="failed to get container status \"b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06\": rpc error: code = NotFound desc = could not find container \"b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06\": container with ID starting with b22969d6b196308bdc123d6b5e369b5231794f431de7f9ae051a523efb551a06 not found: ID does not exist" Feb 17 18:53:57 crc kubenswrapper[4892]: I0217 18:53:57.373689 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" path="/var/lib/kubelet/pods/6013816c-d9bf-48a2-9364-30109d8b0084/volumes" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.638534 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9mf5m"] Feb 17 18:54:01 
crc kubenswrapper[4892]: E0217 18:54:01.639528 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="extract-content" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.639552 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="extract-content" Feb 17 18:54:01 crc kubenswrapper[4892]: E0217 18:54:01.639609 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="extract-utilities" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.639622 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="extract-utilities" Feb 17 18:54:01 crc kubenswrapper[4892]: E0217 18:54:01.641393 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="registry-server" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.641458 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="registry-server" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.641896 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6013816c-d9bf-48a2-9364-30109d8b0084" containerName="registry-server" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.644023 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.649250 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9mf5m"] Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.710415 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6np7\" (UniqueName: \"kubernetes.io/projected/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-kube-api-access-s6np7\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.710476 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-catalog-content\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.710510 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-utilities\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.812532 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6np7\" (UniqueName: \"kubernetes.io/projected/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-kube-api-access-s6np7\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.812615 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-catalog-content\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.812661 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-utilities\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.813293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-utilities\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.813423 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-catalog-content\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.841921 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6np7\" (UniqueName: \"kubernetes.io/projected/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-kube-api-access-s6np7\") pod \"certified-operators-9mf5m\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:01 crc kubenswrapper[4892]: I0217 18:54:01.966019 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:02 crc kubenswrapper[4892]: I0217 18:54:02.436918 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9mf5m"] Feb 17 18:54:03 crc kubenswrapper[4892]: I0217 18:54:03.092505 4892 generic.go:334] "Generic (PLEG): container finished" podID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerID="2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093" exitCode=0 Feb 17 18:54:03 crc kubenswrapper[4892]: I0217 18:54:03.092544 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerDied","Data":"2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093"} Feb 17 18:54:03 crc kubenswrapper[4892]: I0217 18:54:03.092568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerStarted","Data":"d3e00ae12ad193cebdea8deeb7da17b9bd37f6e3b28d39fcab8e6cb37fb9d54e"} Feb 17 18:54:08 crc kubenswrapper[4892]: I0217 18:54:08.142460 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerStarted","Data":"501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac"} Feb 17 18:54:09 crc kubenswrapper[4892]: I0217 18:54:09.153270 4892 generic.go:334] "Generic (PLEG): container finished" podID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerID="501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac" exitCode=0 Feb 17 18:54:09 crc kubenswrapper[4892]: I0217 18:54:09.153316 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" 
event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerDied","Data":"501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac"} Feb 17 18:54:10 crc kubenswrapper[4892]: I0217 18:54:10.166991 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerStarted","Data":"bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683"} Feb 17 18:54:10 crc kubenswrapper[4892]: I0217 18:54:10.235540 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9mf5m" podStartSLOduration=2.741342058 podStartE2EDuration="9.235516886s" podCreationTimestamp="2026-02-17 18:54:01 +0000 UTC" firstStartedPulling="2026-02-17 18:54:03.095112634 +0000 UTC m=+4214.470515899" lastFinishedPulling="2026-02-17 18:54:09.589287462 +0000 UTC m=+4220.964690727" observedRunningTime="2026-02-17 18:54:10.216908015 +0000 UTC m=+4221.592311300" watchObservedRunningTime="2026-02-17 18:54:10.235516886 +0000 UTC m=+4221.610920151" Feb 17 18:54:11 crc kubenswrapper[4892]: I0217 18:54:11.966799 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:11 crc kubenswrapper[4892]: I0217 18:54:11.968105 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:12 crc kubenswrapper[4892]: I0217 18:54:12.171584 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.043084 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.125390 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-9mf5m"] Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.270286 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9mf5m" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="registry-server" containerID="cri-o://bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683" gracePeriod=2 Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.705579 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.804368 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-utilities\") pod \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.804463 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6np7\" (UniqueName: \"kubernetes.io/projected/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-kube-api-access-s6np7\") pod \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.804516 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-catalog-content\") pod \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\" (UID: \"f5eb3931-9ae4-45cb-b068-c5c0afeb932c\") " Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.805233 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-utilities" (OuterVolumeSpecName: "utilities") pod "f5eb3931-9ae4-45cb-b068-c5c0afeb932c" (UID: 
"f5eb3931-9ae4-45cb-b068-c5c0afeb932c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.810311 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-kube-api-access-s6np7" (OuterVolumeSpecName: "kube-api-access-s6np7") pod "f5eb3931-9ae4-45cb-b068-c5c0afeb932c" (UID: "f5eb3931-9ae4-45cb-b068-c5c0afeb932c"). InnerVolumeSpecName "kube-api-access-s6np7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.861671 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5eb3931-9ae4-45cb-b068-c5c0afeb932c" (UID: "f5eb3931-9ae4-45cb-b068-c5c0afeb932c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.905916 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.905962 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6np7\" (UniqueName: \"kubernetes.io/projected/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-kube-api-access-s6np7\") on node \"crc\" DevicePath \"\"" Feb 17 18:54:22 crc kubenswrapper[4892]: I0217 18:54:22.905975 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb3931-9ae4-45cb-b068-c5c0afeb932c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.283990 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerID="bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683" exitCode=0 Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.284059 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerDied","Data":"bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683"} Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.284120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mf5m" event={"ID":"f5eb3931-9ae4-45cb-b068-c5c0afeb932c","Type":"ContainerDied","Data":"d3e00ae12ad193cebdea8deeb7da17b9bd37f6e3b28d39fcab8e6cb37fb9d54e"} Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.284157 4892 scope.go:117] "RemoveContainer" containerID="bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.284077 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9mf5m" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.309025 4892 scope.go:117] "RemoveContainer" containerID="501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.343447 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9mf5m"] Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.347783 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9mf5m"] Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.360549 4892 scope.go:117] "RemoveContainer" containerID="2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.370902 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" path="/var/lib/kubelet/pods/f5eb3931-9ae4-45cb-b068-c5c0afeb932c/volumes" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.390796 4892 scope.go:117] "RemoveContainer" containerID="bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683" Feb 17 18:54:23 crc kubenswrapper[4892]: E0217 18:54:23.391770 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683\": container with ID starting with bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683 not found: ID does not exist" containerID="bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.391868 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683"} err="failed to get container status 
\"bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683\": rpc error: code = NotFound desc = could not find container \"bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683\": container with ID starting with bd9b1f95be0a85e22138b2af7ad3502b35657e9cb09141c5ed58bbe63cc1a683 not found: ID does not exist" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.391902 4892 scope.go:117] "RemoveContainer" containerID="501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac" Feb 17 18:54:23 crc kubenswrapper[4892]: E0217 18:54:23.392504 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac\": container with ID starting with 501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac not found: ID does not exist" containerID="501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.392554 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac"} err="failed to get container status \"501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac\": rpc error: code = NotFound desc = could not find container \"501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac\": container with ID starting with 501739f7ec6753f764db90ed8cb06a06bb754c0b0cf38c711a21db04f5b697ac not found: ID does not exist" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.392584 4892 scope.go:117] "RemoveContainer" containerID="2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093" Feb 17 18:54:23 crc kubenswrapper[4892]: E0217 18:54:23.392851 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093\": container with ID starting with 2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093 not found: ID does not exist" containerID="2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093" Feb 17 18:54:23 crc kubenswrapper[4892]: I0217 18:54:23.392877 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093"} err="failed to get container status \"2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093\": rpc error: code = NotFound desc = could not find container \"2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093\": container with ID starting with 2956951e3cc6254c21fa917b61c54c176d9cb2d201fed1dd56b762bace0fe093 not found: ID does not exist" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.780566 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6jwp"] Feb 17 18:55:04 crc kubenswrapper[4892]: E0217 18:55:04.781509 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="registry-server" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.781521 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="registry-server" Feb 17 18:55:04 crc kubenswrapper[4892]: E0217 18:55:04.781548 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="extract-content" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.781555 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="extract-content" Feb 17 18:55:04 crc kubenswrapper[4892]: E0217 18:55:04.781562 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="extract-utilities" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.781570 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="extract-utilities" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.781750 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eb3931-9ae4-45cb-b068-c5c0afeb932c" containerName="registry-server" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.783032 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.800889 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6jwp"] Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.861829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-utilities\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.861908 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8s8\" (UniqueName: \"kubernetes.io/projected/6ca7313d-d73a-4cba-b2eb-149d231efc95-kube-api-access-zr8s8\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.862008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-catalog-content\") pod \"community-operators-h6jwp\" (UID: 
\"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.963880 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-catalog-content\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.963979 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-utilities\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.964025 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8s8\" (UniqueName: \"kubernetes.io/projected/6ca7313d-d73a-4cba-b2eb-149d231efc95-kube-api-access-zr8s8\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.964576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-utilities\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:04 crc kubenswrapper[4892]: I0217 18:55:04.964598 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-catalog-content\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") 
" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:05 crc kubenswrapper[4892]: I0217 18:55:05.006251 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8s8\" (UniqueName: \"kubernetes.io/projected/6ca7313d-d73a-4cba-b2eb-149d231efc95-kube-api-access-zr8s8\") pod \"community-operators-h6jwp\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:05 crc kubenswrapper[4892]: I0217 18:55:05.105054 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:05 crc kubenswrapper[4892]: I0217 18:55:05.630652 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6jwp"] Feb 17 18:55:05 crc kubenswrapper[4892]: W0217 18:55:05.644261 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca7313d_d73a_4cba_b2eb_149d231efc95.slice/crio-f3c96c8c389b950967bd3e75f4375852c26d455c39c6e079394e5903c2bdfc5c WatchSource:0}: Error finding container f3c96c8c389b950967bd3e75f4375852c26d455c39c6e079394e5903c2bdfc5c: Status 404 returned error can't find the container with id f3c96c8c389b950967bd3e75f4375852c26d455c39c6e079394e5903c2bdfc5c Feb 17 18:55:05 crc kubenswrapper[4892]: I0217 18:55:05.705747 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerStarted","Data":"f3c96c8c389b950967bd3e75f4375852c26d455c39c6e079394e5903c2bdfc5c"} Feb 17 18:55:06 crc kubenswrapper[4892]: I0217 18:55:06.718019 4892 generic.go:334] "Generic (PLEG): container finished" podID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerID="20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8" exitCode=0 Feb 17 18:55:06 crc kubenswrapper[4892]: I0217 
18:55:06.718083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerDied","Data":"20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8"} Feb 17 18:55:07 crc kubenswrapper[4892]: I0217 18:55:07.424289 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:55:07 crc kubenswrapper[4892]: I0217 18:55:07.424719 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:55:07 crc kubenswrapper[4892]: I0217 18:55:07.730248 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerStarted","Data":"0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f"} Feb 17 18:55:08 crc kubenswrapper[4892]: I0217 18:55:08.737944 4892 generic.go:334] "Generic (PLEG): container finished" podID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerID="0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f" exitCode=0 Feb 17 18:55:08 crc kubenswrapper[4892]: I0217 18:55:08.738009 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerDied","Data":"0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f"} Feb 17 18:55:09 crc kubenswrapper[4892]: I0217 18:55:09.751241 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerStarted","Data":"7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002"} Feb 17 18:55:15 crc kubenswrapper[4892]: I0217 18:55:15.105478 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:15 crc kubenswrapper[4892]: I0217 18:55:15.106192 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:15 crc kubenswrapper[4892]: I0217 18:55:15.178409 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:15 crc kubenswrapper[4892]: I0217 18:55:15.204990 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6jwp" podStartSLOduration=8.740845022 podStartE2EDuration="11.204970251s" podCreationTimestamp="2026-02-17 18:55:04 +0000 UTC" firstStartedPulling="2026-02-17 18:55:06.720371686 +0000 UTC m=+4278.095774971" lastFinishedPulling="2026-02-17 18:55:09.184496935 +0000 UTC m=+4280.559900200" observedRunningTime="2026-02-17 18:55:09.781341787 +0000 UTC m=+4281.156745062" watchObservedRunningTime="2026-02-17 18:55:15.204970251 +0000 UTC m=+4286.580373526" Feb 17 18:55:15 crc kubenswrapper[4892]: I0217 18:55:15.889765 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:15 crc kubenswrapper[4892]: I0217 18:55:15.984473 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6jwp"] Feb 17 18:55:17 crc kubenswrapper[4892]: I0217 18:55:17.822813 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-h6jwp" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="registry-server" containerID="cri-o://7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002" gracePeriod=2 Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.249272 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.394340 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-catalog-content\") pod \"6ca7313d-d73a-4cba-b2eb-149d231efc95\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.394611 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-utilities\") pod \"6ca7313d-d73a-4cba-b2eb-149d231efc95\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.394647 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr8s8\" (UniqueName: \"kubernetes.io/projected/6ca7313d-d73a-4cba-b2eb-149d231efc95-kube-api-access-zr8s8\") pod \"6ca7313d-d73a-4cba-b2eb-149d231efc95\" (UID: \"6ca7313d-d73a-4cba-b2eb-149d231efc95\") " Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.395621 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-utilities" (OuterVolumeSpecName: "utilities") pod "6ca7313d-d73a-4cba-b2eb-149d231efc95" (UID: "6ca7313d-d73a-4cba-b2eb-149d231efc95"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.396347 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.404488 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca7313d-d73a-4cba-b2eb-149d231efc95-kube-api-access-zr8s8" (OuterVolumeSpecName: "kube-api-access-zr8s8") pod "6ca7313d-d73a-4cba-b2eb-149d231efc95" (UID: "6ca7313d-d73a-4cba-b2eb-149d231efc95"). InnerVolumeSpecName "kube-api-access-zr8s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.498751 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr8s8\" (UniqueName: \"kubernetes.io/projected/6ca7313d-d73a-4cba-b2eb-149d231efc95-kube-api-access-zr8s8\") on node \"crc\" DevicePath \"\"" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.610260 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ca7313d-d73a-4cba-b2eb-149d231efc95" (UID: "6ca7313d-d73a-4cba-b2eb-149d231efc95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.704900 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca7313d-d73a-4cba-b2eb-149d231efc95-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.831600 4892 generic.go:334] "Generic (PLEG): container finished" podID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerID="7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002" exitCode=0 Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.831671 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerDied","Data":"7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002"} Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.831717 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6jwp" event={"ID":"6ca7313d-d73a-4cba-b2eb-149d231efc95","Type":"ContainerDied","Data":"f3c96c8c389b950967bd3e75f4375852c26d455c39c6e079394e5903c2bdfc5c"} Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.831741 4892 scope.go:117] "RemoveContainer" containerID="7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.831677 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6jwp" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.862955 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6jwp"] Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.866115 4892 scope.go:117] "RemoveContainer" containerID="0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.871100 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6jwp"] Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.890423 4892 scope.go:117] "RemoveContainer" containerID="20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.908878 4892 scope.go:117] "RemoveContainer" containerID="7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002" Feb 17 18:55:18 crc kubenswrapper[4892]: E0217 18:55:18.909252 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002\": container with ID starting with 7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002 not found: ID does not exist" containerID="7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.909325 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002"} err="failed to get container status \"7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002\": rpc error: code = NotFound desc = could not find container \"7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002\": container with ID starting with 7c8c46f82f4744b8a46e5be5f11e59a20c9a245ac8096a499cc16c2c41cd9002 not 
found: ID does not exist" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.909360 4892 scope.go:117] "RemoveContainer" containerID="0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f" Feb 17 18:55:18 crc kubenswrapper[4892]: E0217 18:55:18.909638 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f\": container with ID starting with 0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f not found: ID does not exist" containerID="0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.909673 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f"} err="failed to get container status \"0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f\": rpc error: code = NotFound desc = could not find container \"0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f\": container with ID starting with 0ed2be564f189125ea51f2eb014a738b54b8b2d1051f9b706763c8b449b8819f not found: ID does not exist" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.909698 4892 scope.go:117] "RemoveContainer" containerID="20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8" Feb 17 18:55:18 crc kubenswrapper[4892]: E0217 18:55:18.909921 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8\": container with ID starting with 20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8 not found: ID does not exist" containerID="20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8" Feb 17 18:55:18 crc kubenswrapper[4892]: I0217 18:55:18.909950 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8"} err="failed to get container status \"20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8\": rpc error: code = NotFound desc = could not find container \"20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8\": container with ID starting with 20b7364ff74c360bc570f613183da32e64de6f936c3dc3783222cb70b49880b8 not found: ID does not exist" Feb 17 18:55:19 crc kubenswrapper[4892]: I0217 18:55:19.377498 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" path="/var/lib/kubelet/pods/6ca7313d-d73a-4cba-b2eb-149d231efc95/volumes" Feb 17 18:55:37 crc kubenswrapper[4892]: I0217 18:55:37.424882 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:55:37 crc kubenswrapper[4892]: I0217 18:55:37.425670 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:56:07 crc kubenswrapper[4892]: I0217 18:56:07.424381 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 18:56:07 crc kubenswrapper[4892]: I0217 18:56:07.425143 4892 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 18:56:07 crc kubenswrapper[4892]: I0217 18:56:07.425190 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 18:56:07 crc kubenswrapper[4892]: I0217 18:56:07.426035 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 18:56:07 crc kubenswrapper[4892]: I0217 18:56:07.426117 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" gracePeriod=600 Feb 17 18:56:07 crc kubenswrapper[4892]: E0217 18:56:07.558917 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:56:08 crc kubenswrapper[4892]: I0217 18:56:08.336894 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" 
containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" exitCode=0 Feb 17 18:56:08 crc kubenswrapper[4892]: I0217 18:56:08.336965 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"} Feb 17 18:56:08 crc kubenswrapper[4892]: I0217 18:56:08.337027 4892 scope.go:117] "RemoveContainer" containerID="a6b98ccb6f64f3fd845ca76a31c615df1295c0d44951805096513fab72501508" Feb 17 18:56:08 crc kubenswrapper[4892]: I0217 18:56:08.337743 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:56:08 crc kubenswrapper[4892]: E0217 18:56:08.338261 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:56:21 crc kubenswrapper[4892]: I0217 18:56:21.359377 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:56:21 crc kubenswrapper[4892]: E0217 18:56:21.361335 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:56:34 crc kubenswrapper[4892]: I0217 
18:56:34.359302 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:56:34 crc kubenswrapper[4892]: E0217 18:56:34.360303 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:56:46 crc kubenswrapper[4892]: I0217 18:56:46.359884 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:56:46 crc kubenswrapper[4892]: E0217 18:56:46.361109 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:57:00 crc kubenswrapper[4892]: I0217 18:57:00.361593 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:57:00 crc kubenswrapper[4892]: E0217 18:57:00.365940 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:57:14 crc 
kubenswrapper[4892]: I0217 18:57:14.359443 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:57:14 crc kubenswrapper[4892]: E0217 18:57:14.360233 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:57:27 crc kubenswrapper[4892]: I0217 18:57:27.359994 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:57:27 crc kubenswrapper[4892]: E0217 18:57:27.361222 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:57:38 crc kubenswrapper[4892]: I0217 18:57:38.360621 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:57:38 crc kubenswrapper[4892]: E0217 18:57:38.361330 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 
17 18:57:51 crc kubenswrapper[4892]: I0217 18:57:51.363201 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:57:51 crc kubenswrapper[4892]: E0217 18:57:51.364568 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:58:04 crc kubenswrapper[4892]: I0217 18:58:04.360614 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:58:04 crc kubenswrapper[4892]: E0217 18:58:04.361685 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:58:18 crc kubenswrapper[4892]: I0217 18:58:18.359988 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:58:18 crc kubenswrapper[4892]: E0217 18:58:18.360853 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:58:32 crc kubenswrapper[4892]: I0217 18:58:32.359343 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:58:32 crc kubenswrapper[4892]: E0217 18:58:32.360057 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:58:44 crc kubenswrapper[4892]: I0217 18:58:44.360390 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:58:44 crc kubenswrapper[4892]: E0217 18:58:44.361211 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:58:59 crc kubenswrapper[4892]: I0217 18:58:59.369391 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:58:59 crc kubenswrapper[4892]: E0217 18:58:59.370727 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:59:13 crc kubenswrapper[4892]: I0217 18:59:13.359203 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:59:13 crc kubenswrapper[4892]: E0217 18:59:13.361290 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:59:24 crc kubenswrapper[4892]: I0217 18:59:24.360300 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:59:24 crc kubenswrapper[4892]: E0217 18:59:24.361442 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:59:36 crc kubenswrapper[4892]: I0217 18:59:36.360282 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:59:36 crc kubenswrapper[4892]: E0217 18:59:36.361296 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 18:59:49 crc kubenswrapper[4892]: I0217 18:59:49.376241 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 18:59:49 crc kubenswrapper[4892]: E0217 18:59:49.376886 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.179595 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"] Feb 17 19:00:00 crc kubenswrapper[4892]: E0217 19:00:00.182088 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="extract-content" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.182211 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="extract-content" Feb 17 19:00:00 crc kubenswrapper[4892]: E0217 19:00:00.182301 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="registry-server" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.182374 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="registry-server" Feb 17 19:00:00 crc kubenswrapper[4892]: E0217 19:00:00.182490 4892 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="extract-utilities" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.182566 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="extract-utilities" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.182937 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca7313d-d73a-4cba-b2eb-149d231efc95" containerName="registry-server" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.183722 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.185880 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.186610 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.192894 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"] Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.336909 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ddccfb8-a091-46b4-962a-c176143d0c7a-config-volume\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.337153 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7ddccfb8-a091-46b4-962a-c176143d0c7a-secret-volume\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.337222 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5hd\" (UniqueName: \"kubernetes.io/projected/7ddccfb8-a091-46b4-962a-c176143d0c7a-kube-api-access-hx5hd\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.438793 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ddccfb8-a091-46b4-962a-c176143d0c7a-config-volume\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.438923 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ddccfb8-a091-46b4-962a-c176143d0c7a-secret-volume\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.439093 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5hd\" (UniqueName: \"kubernetes.io/projected/7ddccfb8-a091-46b4-962a-c176143d0c7a-kube-api-access-hx5hd\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" Feb 17 19:00:00 crc 
kubenswrapper[4892]: I0217 19:00:00.441091 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ddccfb8-a091-46b4-962a-c176143d0c7a-config-volume\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"
Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.448635 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ddccfb8-a091-46b4-962a-c176143d0c7a-secret-volume\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"
Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.459050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5hd\" (UniqueName: \"kubernetes.io/projected/7ddccfb8-a091-46b4-962a-c176143d0c7a-kube-api-access-hx5hd\") pod \"collect-profiles-29522580-v4ck9\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"
Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.511511 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"
Feb 17 19:00:00 crc kubenswrapper[4892]: I0217 19:00:00.988509 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"]
Feb 17 19:00:01 crc kubenswrapper[4892]: I0217 19:00:01.819885 4892 generic.go:334] "Generic (PLEG): container finished" podID="7ddccfb8-a091-46b4-962a-c176143d0c7a" containerID="7f0aa7858d6abd58c096f23ce9a351f74240c3af18a152f7ef0abd7b8976e91f" exitCode=0
Feb 17 19:00:01 crc kubenswrapper[4892]: I0217 19:00:01.820168 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" event={"ID":"7ddccfb8-a091-46b4-962a-c176143d0c7a","Type":"ContainerDied","Data":"7f0aa7858d6abd58c096f23ce9a351f74240c3af18a152f7ef0abd7b8976e91f"}
Feb 17 19:00:01 crc kubenswrapper[4892]: I0217 19:00:01.820203 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" event={"ID":"7ddccfb8-a091-46b4-962a-c176143d0c7a","Type":"ContainerStarted","Data":"9c2dab0d6556bdd9276b29d8d24283e58c4f0af63d1473361f49afa6eb1c5eec"}
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.204176 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.383806 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ddccfb8-a091-46b4-962a-c176143d0c7a-secret-volume\") pod \"7ddccfb8-a091-46b4-962a-c176143d0c7a\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") "
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.383926 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5hd\" (UniqueName: \"kubernetes.io/projected/7ddccfb8-a091-46b4-962a-c176143d0c7a-kube-api-access-hx5hd\") pod \"7ddccfb8-a091-46b4-962a-c176143d0c7a\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") "
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.384052 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ddccfb8-a091-46b4-962a-c176143d0c7a-config-volume\") pod \"7ddccfb8-a091-46b4-962a-c176143d0c7a\" (UID: \"7ddccfb8-a091-46b4-962a-c176143d0c7a\") "
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.385059 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ddccfb8-a091-46b4-962a-c176143d0c7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "7ddccfb8-a091-46b4-962a-c176143d0c7a" (UID: "7ddccfb8-a091-46b4-962a-c176143d0c7a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.387215 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ddccfb8-a091-46b4-962a-c176143d0c7a-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.394651 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddccfb8-a091-46b4-962a-c176143d0c7a-kube-api-access-hx5hd" (OuterVolumeSpecName: "kube-api-access-hx5hd") pod "7ddccfb8-a091-46b4-962a-c176143d0c7a" (UID: "7ddccfb8-a091-46b4-962a-c176143d0c7a"). InnerVolumeSpecName "kube-api-access-hx5hd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.398695 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddccfb8-a091-46b4-962a-c176143d0c7a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7ddccfb8-a091-46b4-962a-c176143d0c7a" (UID: "7ddccfb8-a091-46b4-962a-c176143d0c7a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.488715 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ddccfb8-a091-46b4-962a-c176143d0c7a-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.489393 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx5hd\" (UniqueName: \"kubernetes.io/projected/7ddccfb8-a091-46b4-962a-c176143d0c7a-kube-api-access-hx5hd\") on node \"crc\" DevicePath \"\""
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.847681 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9" event={"ID":"7ddccfb8-a091-46b4-962a-c176143d0c7a","Type":"ContainerDied","Data":"9c2dab0d6556bdd9276b29d8d24283e58c4f0af63d1473361f49afa6eb1c5eec"}
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.847741 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c2dab0d6556bdd9276b29d8d24283e58c4f0af63d1473361f49afa6eb1c5eec"
Feb 17 19:00:03 crc kubenswrapper[4892]: I0217 19:00:03.847787 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"
Feb 17 19:00:04 crc kubenswrapper[4892]: I0217 19:00:04.300153 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz"]
Feb 17 19:00:04 crc kubenswrapper[4892]: I0217 19:00:04.308211 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522535-k86zz"]
Feb 17 19:00:04 crc kubenswrapper[4892]: I0217 19:00:04.358963 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"
Feb 17 19:00:04 crc kubenswrapper[4892]: E0217 19:00:04.359230 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:00:05 crc kubenswrapper[4892]: I0217 19:00:05.374517 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34695ec1-983c-49e1-b645-82a0e41b0b35" path="/var/lib/kubelet/pods/34695ec1-983c-49e1-b645-82a0e41b0b35/volumes"
Feb 17 19:00:18 crc kubenswrapper[4892]: I0217 19:00:18.360269 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"
Feb 17 19:00:18 crc kubenswrapper[4892]: E0217 19:00:18.361234 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:00:33 crc kubenswrapper[4892]: I0217 19:00:33.359931 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"
Feb 17 19:00:33 crc kubenswrapper[4892]: E0217 19:00:33.361052 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:00:48 crc kubenswrapper[4892]: I0217 19:00:48.358978 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"
Feb 17 19:00:48 crc kubenswrapper[4892]: E0217 19:00:48.359943 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:01:00 crc kubenswrapper[4892]: I0217 19:01:00.771502 4892 scope.go:117] "RemoveContainer" containerID="65bbdf31aaeebc94fb95fd4fd370b1dc72876f8c856d3bf3966e599e1f0db999"
Feb 17 19:01:01 crc kubenswrapper[4892]: I0217 19:01:01.359797 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"
Feb 17 19:01:01 crc kubenswrapper[4892]: E0217 19:01:01.360886 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.696126 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-dckf8"]
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.704684 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-dckf8"]
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.834086 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bwcm8"]
Feb 17 19:01:03 crc kubenswrapper[4892]: E0217 19:01:03.834504 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ddccfb8-a091-46b4-962a-c176143d0c7a" containerName="collect-profiles"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.834527 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddccfb8-a091-46b4-962a-c176143d0c7a" containerName="collect-profiles"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.834769 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddccfb8-a091-46b4-962a-c176143d0c7a" containerName="collect-profiles"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.835426 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.841329 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.841399 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.841569 4892 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6s48s"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.841923 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.849521 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bwcm8"]
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.912566 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0509737d-dbc0-4be0-8546-55944a2b0621-node-mnt\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.912635 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnw5g\" (UniqueName: \"kubernetes.io/projected/0509737d-dbc0-4be0-8546-55944a2b0621-kube-api-access-fnw5g\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:03 crc kubenswrapper[4892]: I0217 19:01:03.912662 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0509737d-dbc0-4be0-8546-55944a2b0621-crc-storage\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.013971 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0509737d-dbc0-4be0-8546-55944a2b0621-node-mnt\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.014026 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnw5g\" (UniqueName: \"kubernetes.io/projected/0509737d-dbc0-4be0-8546-55944a2b0621-kube-api-access-fnw5g\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.014048 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0509737d-dbc0-4be0-8546-55944a2b0621-crc-storage\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.014375 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0509737d-dbc0-4be0-8546-55944a2b0621-node-mnt\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.014768 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0509737d-dbc0-4be0-8546-55944a2b0621-crc-storage\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.035687 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnw5g\" (UniqueName: \"kubernetes.io/projected/0509737d-dbc0-4be0-8546-55944a2b0621-kube-api-access-fnw5g\") pod \"crc-storage-crc-bwcm8\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") " pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.166773 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.471903 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bwcm8"]
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.481040 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 19:01:04 crc kubenswrapper[4892]: I0217 19:01:04.506267 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bwcm8" event={"ID":"0509737d-dbc0-4be0-8546-55944a2b0621","Type":"ContainerStarted","Data":"322a3493eb85dc78a8b51838e8b8012e7d178c3b45cd5cd4c6d775db2b4e9b43"}
Feb 17 19:01:05 crc kubenswrapper[4892]: I0217 19:01:05.377613 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e475f89c-c290-4a9b-8475-2d6155ec7ea2" path="/var/lib/kubelet/pods/e475f89c-c290-4a9b-8475-2d6155ec7ea2/volumes"
Feb 17 19:01:05 crc kubenswrapper[4892]: I0217 19:01:05.517554 4892 generic.go:334] "Generic (PLEG): container finished" podID="0509737d-dbc0-4be0-8546-55944a2b0621" containerID="ffe79af3b0f598c55b6de591794bd2c31e530cf69fafe08910716c5178225f7b" exitCode=0
Feb 17 19:01:05 crc kubenswrapper[4892]: I0217 19:01:05.517611 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bwcm8" event={"ID":"0509737d-dbc0-4be0-8546-55944a2b0621","Type":"ContainerDied","Data":"ffe79af3b0f598c55b6de591794bd2c31e530cf69fafe08910716c5178225f7b"}
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.899253 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.991639 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0509737d-dbc0-4be0-8546-55944a2b0621-node-mnt\") pod \"0509737d-dbc0-4be0-8546-55944a2b0621\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") "
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.991751 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0509737d-dbc0-4be0-8546-55944a2b0621-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0509737d-dbc0-4be0-8546-55944a2b0621" (UID: "0509737d-dbc0-4be0-8546-55944a2b0621"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.991786 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0509737d-dbc0-4be0-8546-55944a2b0621-crc-storage\") pod \"0509737d-dbc0-4be0-8546-55944a2b0621\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") "
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.991889 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnw5g\" (UniqueName: \"kubernetes.io/projected/0509737d-dbc0-4be0-8546-55944a2b0621-kube-api-access-fnw5g\") pod \"0509737d-dbc0-4be0-8546-55944a2b0621\" (UID: \"0509737d-dbc0-4be0-8546-55944a2b0621\") "
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.992296 4892 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0509737d-dbc0-4be0-8546-55944a2b0621-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 17 19:01:06 crc kubenswrapper[4892]: I0217 19:01:06.999730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0509737d-dbc0-4be0-8546-55944a2b0621-kube-api-access-fnw5g" (OuterVolumeSpecName: "kube-api-access-fnw5g") pod "0509737d-dbc0-4be0-8546-55944a2b0621" (UID: "0509737d-dbc0-4be0-8546-55944a2b0621"). InnerVolumeSpecName "kube-api-access-fnw5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:01:07 crc kubenswrapper[4892]: I0217 19:01:07.031320 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0509737d-dbc0-4be0-8546-55944a2b0621-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0509737d-dbc0-4be0-8546-55944a2b0621" (UID: "0509737d-dbc0-4be0-8546-55944a2b0621"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:01:07 crc kubenswrapper[4892]: I0217 19:01:07.094767 4892 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0509737d-dbc0-4be0-8546-55944a2b0621-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 17 19:01:07 crc kubenswrapper[4892]: I0217 19:01:07.094877 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnw5g\" (UniqueName: \"kubernetes.io/projected/0509737d-dbc0-4be0-8546-55944a2b0621-kube-api-access-fnw5g\") on node \"crc\" DevicePath \"\""
Feb 17 19:01:07 crc kubenswrapper[4892]: I0217 19:01:07.537937 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bwcm8" event={"ID":"0509737d-dbc0-4be0-8546-55944a2b0621","Type":"ContainerDied","Data":"322a3493eb85dc78a8b51838e8b8012e7d178c3b45cd5cd4c6d775db2b4e9b43"}
Feb 17 19:01:07 crc kubenswrapper[4892]: I0217 19:01:07.538361 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="322a3493eb85dc78a8b51838e8b8012e7d178c3b45cd5cd4c6d775db2b4e9b43"
Feb 17 19:01:07 crc kubenswrapper[4892]: I0217 19:01:07.537991 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwcm8"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.379680 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bwcm8"]
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.388415 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bwcm8"]
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.534164 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-f4vmn"]
Feb 17 19:01:09 crc kubenswrapper[4892]: E0217 19:01:09.534842 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0509737d-dbc0-4be0-8546-55944a2b0621" containerName="storage"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.534862 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0509737d-dbc0-4be0-8546-55944a2b0621" containerName="storage"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.535075 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0509737d-dbc0-4be0-8546-55944a2b0621" containerName="storage"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.535811 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.539169 4892 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6s48s"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.539350 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.539350 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.539417 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.540666 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-f4vmn"]
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.658454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9b9dff1-79dc-479c-8c8a-207ffa828672-crc-storage\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.658528 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgxp\" (UniqueName: \"kubernetes.io/projected/c9b9dff1-79dc-479c-8c8a-207ffa828672-kube-api-access-nqgxp\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.658584 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9b9dff1-79dc-479c-8c8a-207ffa828672-node-mnt\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.760881 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9b9dff1-79dc-479c-8c8a-207ffa828672-node-mnt\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.761248 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9b9dff1-79dc-479c-8c8a-207ffa828672-crc-storage\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.761536 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqgxp\" (UniqueName: \"kubernetes.io/projected/c9b9dff1-79dc-479c-8c8a-207ffa828672-kube-api-access-nqgxp\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.761689 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9b9dff1-79dc-479c-8c8a-207ffa828672-node-mnt\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.762769 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9b9dff1-79dc-479c-8c8a-207ffa828672-crc-storage\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.780682 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqgxp\" (UniqueName: \"kubernetes.io/projected/c9b9dff1-79dc-479c-8c8a-207ffa828672-kube-api-access-nqgxp\") pod \"crc-storage-crc-f4vmn\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") " pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:09 crc kubenswrapper[4892]: I0217 19:01:09.905856 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:10 crc kubenswrapper[4892]: I0217 19:01:10.393023 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-f4vmn"]
Feb 17 19:01:10 crc kubenswrapper[4892]: W0217 19:01:10.409367 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9b9dff1_79dc_479c_8c8a_207ffa828672.slice/crio-6c83597ac85b738f2c0ecc7afcef9d02c4f53b871587a558c5b2dcf2edab6033 WatchSource:0}: Error finding container 6c83597ac85b738f2c0ecc7afcef9d02c4f53b871587a558c5b2dcf2edab6033: Status 404 returned error can't find the container with id 6c83597ac85b738f2c0ecc7afcef9d02c4f53b871587a558c5b2dcf2edab6033
Feb 17 19:01:10 crc kubenswrapper[4892]: I0217 19:01:10.596923 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f4vmn" event={"ID":"c9b9dff1-79dc-479c-8c8a-207ffa828672","Type":"ContainerStarted","Data":"6c83597ac85b738f2c0ecc7afcef9d02c4f53b871587a558c5b2dcf2edab6033"}
Feb 17 19:01:11 crc kubenswrapper[4892]: I0217 19:01:11.377924 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0509737d-dbc0-4be0-8546-55944a2b0621" path="/var/lib/kubelet/pods/0509737d-dbc0-4be0-8546-55944a2b0621/volumes"
Feb 17 19:01:11 crc kubenswrapper[4892]: I0217 19:01:11.610978 4892 generic.go:334] "Generic (PLEG): container finished" podID="c9b9dff1-79dc-479c-8c8a-207ffa828672" containerID="f1a1be731d5f7d536b98fbb6b98880ede204f7b407262d9960970679d60d3346" exitCode=0
Feb 17 19:01:11 crc kubenswrapper[4892]: I0217 19:01:11.611022 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f4vmn" event={"ID":"c9b9dff1-79dc-479c-8c8a-207ffa828672","Type":"ContainerDied","Data":"f1a1be731d5f7d536b98fbb6b98880ede204f7b407262d9960970679d60d3346"}
Feb 17 19:01:12 crc kubenswrapper[4892]: I0217 19:01:12.940124 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.123299 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9b9dff1-79dc-479c-8c8a-207ffa828672-crc-storage\") pod \"c9b9dff1-79dc-479c-8c8a-207ffa828672\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") "
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.123355 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqgxp\" (UniqueName: \"kubernetes.io/projected/c9b9dff1-79dc-479c-8c8a-207ffa828672-kube-api-access-nqgxp\") pod \"c9b9dff1-79dc-479c-8c8a-207ffa828672\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") "
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.123484 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9b9dff1-79dc-479c-8c8a-207ffa828672-node-mnt\") pod \"c9b9dff1-79dc-479c-8c8a-207ffa828672\" (UID: \"c9b9dff1-79dc-479c-8c8a-207ffa828672\") "
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.123730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9b9dff1-79dc-479c-8c8a-207ffa828672-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c9b9dff1-79dc-479c-8c8a-207ffa828672" (UID: "c9b9dff1-79dc-479c-8c8a-207ffa828672"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.123900 4892 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9b9dff1-79dc-479c-8c8a-207ffa828672-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.142068 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b9dff1-79dc-479c-8c8a-207ffa828672-kube-api-access-nqgxp" (OuterVolumeSpecName: "kube-api-access-nqgxp") pod "c9b9dff1-79dc-479c-8c8a-207ffa828672" (UID: "c9b9dff1-79dc-479c-8c8a-207ffa828672"). InnerVolumeSpecName "kube-api-access-nqgxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.145891 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b9dff1-79dc-479c-8c8a-207ffa828672-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c9b9dff1-79dc-479c-8c8a-207ffa828672" (UID: "c9b9dff1-79dc-479c-8c8a-207ffa828672"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.225506 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqgxp\" (UniqueName: \"kubernetes.io/projected/c9b9dff1-79dc-479c-8c8a-207ffa828672-kube-api-access-nqgxp\") on node \"crc\" DevicePath \"\""
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.225541 4892 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9b9dff1-79dc-479c-8c8a-207ffa828672-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.631110 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f4vmn" event={"ID":"c9b9dff1-79dc-479c-8c8a-207ffa828672","Type":"ContainerDied","Data":"6c83597ac85b738f2c0ecc7afcef9d02c4f53b871587a558c5b2dcf2edab6033"}
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.631149 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c83597ac85b738f2c0ecc7afcef9d02c4f53b871587a558c5b2dcf2edab6033"
Feb 17 19:01:13 crc kubenswrapper[4892]: I0217 19:01:13.631234 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f4vmn"
Feb 17 19:01:15 crc kubenswrapper[4892]: I0217 19:01:15.360383 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51"
Feb 17 19:01:16 crc kubenswrapper[4892]: I0217 19:01:16.663104 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"330e1bd88e0b3cdc2508642d4af65f7484d521050f49965de99307ecfc4b06ea"}
Feb 17 19:02:00 crc kubenswrapper[4892]: I0217 19:02:00.845939 4892 scope.go:117] "RemoveContainer" containerID="7b2c8e54ef6f6676cc9ceaabe3aac48284268991de638a01a0cf0bb47c8f122b"
Feb 17 19:03:37 crc kubenswrapper[4892]: I0217 19:03:37.425214 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 19:03:37 crc kubenswrapper[4892]: I0217 19:03:37.427154 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 19:04:04 crc kubenswrapper[4892]: I0217 19:04:04.973252 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ktvn8"]
Feb 17 19:04:04 crc kubenswrapper[4892]: E0217 19:04:04.974459 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b9dff1-79dc-479c-8c8a-207ffa828672" containerName="storage"
Feb 17 19:04:04 crc kubenswrapper[4892]: I0217 19:04:04.974482 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b9dff1-79dc-479c-8c8a-207ffa828672" containerName="storage"
Feb 17 19:04:04 crc kubenswrapper[4892]: I0217 19:04:04.974964 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b9dff1-79dc-479c-8c8a-207ffa828672" containerName="storage"
Feb 17 19:04:04 crc kubenswrapper[4892]: I0217 19:04:04.978283 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.003397 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktvn8"]
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.035056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-utilities\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.035187 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9m2n\" (UniqueName: \"kubernetes.io/projected/4f48376c-d805-4bcb-9678-6fce5156946c-kube-api-access-z9m2n\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.035236 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-catalog-content\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.137165 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-catalog-content\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.137339 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-utilities\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.137495 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9m2n\" (UniqueName: \"kubernetes.io/projected/4f48376c-d805-4bcb-9678-6fce5156946c-kube-api-access-z9m2n\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.138677 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-catalog-content\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.139200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-utilities\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8"
Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.159510 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9m2n\" (UniqueName:
\"kubernetes.io/projected/4f48376c-d805-4bcb-9678-6fce5156946c-kube-api-access-z9m2n\") pod \"redhat-marketplace-ktvn8\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.352298 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:05 crc kubenswrapper[4892]: I0217 19:04:05.804339 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktvn8"] Feb 17 19:04:05 crc kubenswrapper[4892]: W0217 19:04:05.814711 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f48376c_d805_4bcb_9678_6fce5156946c.slice/crio-47327a0a3be93a798f4e1d5ef11fd69edfceb9284b652de53e06095498fe0b08 WatchSource:0}: Error finding container 47327a0a3be93a798f4e1d5ef11fd69edfceb9284b652de53e06095498fe0b08: Status 404 returned error can't find the container with id 47327a0a3be93a798f4e1d5ef11fd69edfceb9284b652de53e06095498fe0b08 Feb 17 19:04:06 crc kubenswrapper[4892]: I0217 19:04:06.485417 4892 generic.go:334] "Generic (PLEG): container finished" podID="4f48376c-d805-4bcb-9678-6fce5156946c" containerID="a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda" exitCode=0 Feb 17 19:04:06 crc kubenswrapper[4892]: I0217 19:04:06.485556 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktvn8" event={"ID":"4f48376c-d805-4bcb-9678-6fce5156946c","Type":"ContainerDied","Data":"a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda"} Feb 17 19:04:06 crc kubenswrapper[4892]: I0217 19:04:06.485888 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktvn8" 
event={"ID":"4f48376c-d805-4bcb-9678-6fce5156946c","Type":"ContainerStarted","Data":"47327a0a3be93a798f4e1d5ef11fd69edfceb9284b652de53e06095498fe0b08"} Feb 17 19:04:07 crc kubenswrapper[4892]: I0217 19:04:07.424337 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:04:07 crc kubenswrapper[4892]: I0217 19:04:07.424376 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:04:08 crc kubenswrapper[4892]: I0217 19:04:08.511199 4892 generic.go:334] "Generic (PLEG): container finished" podID="4f48376c-d805-4bcb-9678-6fce5156946c" containerID="be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e" exitCode=0 Feb 17 19:04:08 crc kubenswrapper[4892]: I0217 19:04:08.511323 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktvn8" event={"ID":"4f48376c-d805-4bcb-9678-6fce5156946c","Type":"ContainerDied","Data":"be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e"} Feb 17 19:04:09 crc kubenswrapper[4892]: I0217 19:04:09.521981 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktvn8" event={"ID":"4f48376c-d805-4bcb-9678-6fce5156946c","Type":"ContainerStarted","Data":"1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f"} Feb 17 19:04:09 crc kubenswrapper[4892]: I0217 19:04:09.550443 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ktvn8" 
podStartSLOduration=3.071200759 podStartE2EDuration="5.550415924s" podCreationTimestamp="2026-02-17 19:04:04 +0000 UTC" firstStartedPulling="2026-02-17 19:04:06.486876945 +0000 UTC m=+4817.862280220" lastFinishedPulling="2026-02-17 19:04:08.96609212 +0000 UTC m=+4820.341495385" observedRunningTime="2026-02-17 19:04:09.541603817 +0000 UTC m=+4820.917007082" watchObservedRunningTime="2026-02-17 19:04:09.550415924 +0000 UTC m=+4820.925819239" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.353000 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.354708 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.411295 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.461387 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dhjs7"] Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.464556 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.479932 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhjs7"] Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.527025 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-utilities\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.527145 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-catalog-content\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.527173 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbgqp\" (UniqueName: \"kubernetes.io/projected/ac86c7a4-9839-4d54-93b1-7451208d6955-kube-api-access-kbgqp\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.621546 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.628501 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-utilities\") pod \"certified-operators-dhjs7\" (UID: 
\"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.628680 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-catalog-content\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.628716 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbgqp\" (UniqueName: \"kubernetes.io/projected/ac86c7a4-9839-4d54-93b1-7451208d6955-kube-api-access-kbgqp\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.629450 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-utilities\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.629743 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-catalog-content\") pod \"certified-operators-dhjs7\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.658298 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbgqp\" (UniqueName: \"kubernetes.io/projected/ac86c7a4-9839-4d54-93b1-7451208d6955-kube-api-access-kbgqp\") pod \"certified-operators-dhjs7\" (UID: 
\"ac86c7a4-9839-4d54-93b1-7451208d6955\") " pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:15 crc kubenswrapper[4892]: I0217 19:04:15.806744 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:16 crc kubenswrapper[4892]: I0217 19:04:16.387170 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhjs7"] Feb 17 19:04:16 crc kubenswrapper[4892]: I0217 19:04:16.427158 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktvn8"] Feb 17 19:04:16 crc kubenswrapper[4892]: I0217 19:04:16.588738 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerStarted","Data":"ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01"} Feb 17 19:04:16 crc kubenswrapper[4892]: I0217 19:04:16.588790 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerStarted","Data":"ce3db951993e61f22eea5ba3f5be3b0587cb36d1bc9a95be79f370581055f1a5"} Feb 17 19:04:17 crc kubenswrapper[4892]: I0217 19:04:17.606065 4892 generic.go:334] "Generic (PLEG): container finished" podID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerID="ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01" exitCode=0 Feb 17 19:04:17 crc kubenswrapper[4892]: I0217 19:04:17.606276 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ktvn8" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="registry-server" containerID="cri-o://1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f" gracePeriod=2 Feb 17 19:04:17 crc kubenswrapper[4892]: I0217 19:04:17.607316 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerDied","Data":"ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01"} Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.089399 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.189911 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-catalog-content\") pod \"4f48376c-d805-4bcb-9678-6fce5156946c\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.189979 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9m2n\" (UniqueName: \"kubernetes.io/projected/4f48376c-d805-4bcb-9678-6fce5156946c-kube-api-access-z9m2n\") pod \"4f48376c-d805-4bcb-9678-6fce5156946c\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.190043 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-utilities\") pod \"4f48376c-d805-4bcb-9678-6fce5156946c\" (UID: \"4f48376c-d805-4bcb-9678-6fce5156946c\") " Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.191164 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-utilities" (OuterVolumeSpecName: "utilities") pod "4f48376c-d805-4bcb-9678-6fce5156946c" (UID: "4f48376c-d805-4bcb-9678-6fce5156946c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.198089 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f48376c-d805-4bcb-9678-6fce5156946c-kube-api-access-z9m2n" (OuterVolumeSpecName: "kube-api-access-z9m2n") pod "4f48376c-d805-4bcb-9678-6fce5156946c" (UID: "4f48376c-d805-4bcb-9678-6fce5156946c"). InnerVolumeSpecName "kube-api-access-z9m2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.214321 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f48376c-d805-4bcb-9678-6fce5156946c" (UID: "4f48376c-d805-4bcb-9678-6fce5156946c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.292844 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.292910 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9m2n\" (UniqueName: \"kubernetes.io/projected/4f48376c-d805-4bcb-9678-6fce5156946c-kube-api-access-z9m2n\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.292926 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f48376c-d805-4bcb-9678-6fce5156946c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.615438 4892 generic.go:334] "Generic (PLEG): container finished" podID="4f48376c-d805-4bcb-9678-6fce5156946c" 
containerID="1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f" exitCode=0 Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.615494 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktvn8" event={"ID":"4f48376c-d805-4bcb-9678-6fce5156946c","Type":"ContainerDied","Data":"1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f"} Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.615519 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktvn8" event={"ID":"4f48376c-d805-4bcb-9678-6fce5156946c","Type":"ContainerDied","Data":"47327a0a3be93a798f4e1d5ef11fd69edfceb9284b652de53e06095498fe0b08"} Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.615536 4892 scope.go:117] "RemoveContainer" containerID="1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.615650 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktvn8" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.628956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerStarted","Data":"111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276"} Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.640778 4892 scope.go:117] "RemoveContainer" containerID="be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.672386 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktvn8"] Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.678514 4892 scope.go:117] "RemoveContainer" containerID="a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.680374 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktvn8"] Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.715414 4892 scope.go:117] "RemoveContainer" containerID="1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f" Feb 17 19:04:18 crc kubenswrapper[4892]: E0217 19:04:18.716290 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f\": container with ID starting with 1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f not found: ID does not exist" containerID="1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.716327 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f"} err="failed to get 
container status \"1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f\": rpc error: code = NotFound desc = could not find container \"1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f\": container with ID starting with 1703a10427cb6a0bb4369433d7328be3cb35ad35cfdc25680ea48b79744c9b7f not found: ID does not exist" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.716353 4892 scope.go:117] "RemoveContainer" containerID="be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e" Feb 17 19:04:18 crc kubenswrapper[4892]: E0217 19:04:18.717048 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e\": container with ID starting with be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e not found: ID does not exist" containerID="be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.717066 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e"} err="failed to get container status \"be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e\": rpc error: code = NotFound desc = could not find container \"be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e\": container with ID starting with be6d7fe9ce3d42cf1b74afe455cf6aad0f18c836b1d389dcf5f0ef897311f47e not found: ID does not exist" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.717078 4892 scope.go:117] "RemoveContainer" containerID="a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda" Feb 17 19:04:18 crc kubenswrapper[4892]: E0217 19:04:18.719473 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda\": container with ID starting with a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda not found: ID does not exist" containerID="a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda" Feb 17 19:04:18 crc kubenswrapper[4892]: I0217 19:04:18.719577 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda"} err="failed to get container status \"a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda\": rpc error: code = NotFound desc = could not find container \"a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda\": container with ID starting with a7f1e65ad71ee807a1cdbe8eef1691c58e629febb502eb6394f85bd0ae5fdbda not found: ID does not exist" Feb 17 19:04:19 crc kubenswrapper[4892]: I0217 19:04:19.376477 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" path="/var/lib/kubelet/pods/4f48376c-d805-4bcb-9678-6fce5156946c/volumes" Feb 17 19:04:19 crc kubenswrapper[4892]: I0217 19:04:19.647775 4892 generic.go:334] "Generic (PLEG): container finished" podID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerID="111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276" exitCode=0 Feb 17 19:04:19 crc kubenswrapper[4892]: I0217 19:04:19.647833 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerDied","Data":"111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276"} Feb 17 19:04:20 crc kubenswrapper[4892]: I0217 19:04:20.694772 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" 
event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerStarted","Data":"4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023"} Feb 17 19:04:20 crc kubenswrapper[4892]: I0217 19:04:20.730340 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dhjs7" podStartSLOduration=3.20934189 podStartE2EDuration="5.730323011s" podCreationTimestamp="2026-02-17 19:04:15 +0000 UTC" firstStartedPulling="2026-02-17 19:04:17.608152911 +0000 UTC m=+4828.983556176" lastFinishedPulling="2026-02-17 19:04:20.129134032 +0000 UTC m=+4831.504537297" observedRunningTime="2026-02-17 19:04:20.722051108 +0000 UTC m=+4832.097454403" watchObservedRunningTime="2026-02-17 19:04:20.730323011 +0000 UTC m=+4832.105726276" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.691946 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dqv8z"] Feb 17 19:04:22 crc kubenswrapper[4892]: E0217 19:04:22.692547 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="extract-content" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.692560 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="extract-content" Feb 17 19:04:22 crc kubenswrapper[4892]: E0217 19:04:22.692585 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="extract-utilities" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.692592 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="extract-utilities" Feb 17 19:04:22 crc kubenswrapper[4892]: E0217 19:04:22.692608 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="registry-server" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 
19:04:22.692614 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="registry-server" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.692808 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f48376c-d805-4bcb-9678-6fce5156946c" containerName="registry-server" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.693896 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.697367 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.697414 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.697612 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.697708 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.698615 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bks9n" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.719693 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dqv8z"] Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.784353 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mv2c\" (UniqueName: \"kubernetes.io/projected/56666973-7a1b-407e-a569-b888f83544e6-kube-api-access-9mv2c\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.784399 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-config\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.784424 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.886941 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mv2c\" (UniqueName: \"kubernetes.io/projected/56666973-7a1b-407e-a569-b888f83544e6-kube-api-access-9mv2c\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.886983 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-config\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.887004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.887772 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.887856 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-config\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.934179 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mv2c\" (UniqueName: \"kubernetes.io/projected/56666973-7a1b-407e-a569-b888f83544e6-kube-api-access-9mv2c\") pod \"dnsmasq-dns-5d7b5456f5-dqv8z\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.957456 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-2jqm4"] Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.968088 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:22 crc kubenswrapper[4892]: I0217 19:04:22.987857 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-2jqm4"] Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.016324 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.091746 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlz92\" (UniqueName: \"kubernetes.io/projected/2a2fba10-1e82-4580-b350-548b00de2a1f-kube-api-access-zlz92\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.092112 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.092209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-config\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.193745 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlz92\" (UniqueName: \"kubernetes.io/projected/2a2fba10-1e82-4580-b350-548b00de2a1f-kube-api-access-zlz92\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.193805 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " 
pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.193919 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-config\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.194894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-config\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.195514 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.216019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlz92\" (UniqueName: \"kubernetes.io/projected/2a2fba10-1e82-4580-b350-548b00de2a1f-kube-api-access-zlz92\") pod \"dnsmasq-dns-98ddfc8f-2jqm4\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.285243 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.495121 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dqv8z"] Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.727241 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" event={"ID":"56666973-7a1b-407e-a569-b888f83544e6","Type":"ContainerStarted","Data":"5c7cb90f9dd6d24d9956df0242ecfa921651024b382a7c53f90c79427f49e14b"} Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.767797 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-2jqm4"] Feb 17 19:04:23 crc kubenswrapper[4892]: W0217 19:04:23.767912 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a2fba10_1e82_4580_b350_548b00de2a1f.slice/crio-f427d7f477bc65453eefe07739b6d11b31540fb3fc220f6f85ae5a46634ac5f6 WatchSource:0}: Error finding container f427d7f477bc65453eefe07739b6d11b31540fb3fc220f6f85ae5a46634ac5f6: Status 404 returned error can't find the container with id f427d7f477bc65453eefe07739b6d11b31540fb3fc220f6f85ae5a46634ac5f6 Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.826306 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.828291 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.835500 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.835568 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.835626 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.835706 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z997r" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.835704 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.848183 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905297 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905348 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905525 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4b86\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-kube-api-access-d4b86\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905618 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5a1d044-9e9c-41ca-96e2-735248659691-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905703 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905785 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905896 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f5a1d044-9e9c-41ca-96e2-735248659691-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:23 crc kubenswrapper[4892]: I0217 19:04:23.905984 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.007611 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.007709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.007735 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.007840 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4b86\" (UniqueName: 
\"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-kube-api-access-d4b86\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.007874 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5a1d044-9e9c-41ca-96e2-735248659691-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.008276 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.008315 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.008376 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.008450 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.008690 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5a1d044-9e9c-41ca-96e2-735248659691-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.008956 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.009925 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.012256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5a1d044-9e9c-41ca-96e2-735248659691-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.012298 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.012335 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5a1d044-9e9c-41ca-96e2-735248659691-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.012397 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.012415 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0bcf9fa32c69b4158e289b66c9f947462be31d0b8fe8c84c345aa605e691f6ba/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.012671 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.026312 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4b86\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-kube-api-access-d4b86\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.039389 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.133669 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.165529 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.170690 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.170787 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.176080 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.179830 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.180169 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-db6cv" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.180179 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.182432 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.213960 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.214286 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4896fa46-2882-440a-b0fe-66ab208de548-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.214398 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.214471 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.214575 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.214940 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvsg7\" (UniqueName: 
\"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-kube-api-access-dvsg7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.218060 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.218179 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.218287 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4896fa46-2882-440a-b0fe-66ab208de548-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320634 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4896fa46-2882-440a-b0fe-66ab208de548-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320762 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320863 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320914 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvsg7\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-kube-api-access-dvsg7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320955 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.320991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.321060 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4896fa46-2882-440a-b0fe-66ab208de548-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.321093 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.322594 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.323236 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.324024 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.324252 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.324427 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.333457 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4896fa46-2882-440a-b0fe-66ab208de548-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.334644 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.334676 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01daa81c8d454feeafbbae15de02c09e5a451b10e7d393ca6cd9336766d33668/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.336736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4896fa46-2882-440a-b0fe-66ab208de548-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.342325 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvsg7\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-kube-api-access-dvsg7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.365297 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.436101 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gmpwt"] Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.438607 4892 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.448378 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gmpwt"] Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.502350 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.527056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8fwk\" (UniqueName: \"kubernetes.io/projected/f8bc4ad2-030f-44cf-b4f6-f438435e0772-kube-api-access-s8fwk\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.527113 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-utilities\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.527218 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-catalog-content\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.628422 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8fwk\" (UniqueName: \"kubernetes.io/projected/f8bc4ad2-030f-44cf-b4f6-f438435e0772-kube-api-access-s8fwk\") pod \"redhat-operators-gmpwt\" (UID: 
\"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.628462 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-utilities\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.628563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-catalog-content\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.628998 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-catalog-content\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.629420 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-utilities\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.655975 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8fwk\" (UniqueName: \"kubernetes.io/projected/f8bc4ad2-030f-44cf-b4f6-f438435e0772-kube-api-access-s8fwk\") pod \"redhat-operators-gmpwt\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " 
pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.669288 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.742533 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5a1d044-9e9c-41ca-96e2-735248659691","Type":"ContainerStarted","Data":"a93004432e89e5a6e7412c58a9f459b6d85dcae2adde344d9ee7d29a63d4b9c0"} Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.748323 4892 generic.go:334] "Generic (PLEG): container finished" podID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerID="80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde" exitCode=0 Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.748406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" event={"ID":"2a2fba10-1e82-4580-b350-548b00de2a1f","Type":"ContainerDied","Data":"80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde"} Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.748441 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" event={"ID":"2a2fba10-1e82-4580-b350-548b00de2a1f","Type":"ContainerStarted","Data":"f427d7f477bc65453eefe07739b6d11b31540fb3fc220f6f85ae5a46634ac5f6"} Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.751288 4892 generic.go:334] "Generic (PLEG): container finished" podID="56666973-7a1b-407e-a569-b888f83544e6" containerID="76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22" exitCode=0 Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.751335 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" event={"ID":"56666973-7a1b-407e-a569-b888f83544e6","Type":"ContainerDied","Data":"76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22"} Feb 17 19:04:24 crc kubenswrapper[4892]: 
I0217 19:04:24.769777 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:24 crc kubenswrapper[4892]: I0217 19:04:24.972656 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:04:25 crc kubenswrapper[4892]: E0217 19:04:25.070280 4892 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 17 19:04:25 crc kubenswrapper[4892]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/56666973-7a1b-407e-a569-b888f83544e6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 19:04:25 crc kubenswrapper[4892]: > podSandboxID="5c7cb90f9dd6d24d9956df0242ecfa921651024b382a7c53f90c79427f49e14b" Feb 17 19:04:25 crc kubenswrapper[4892]: E0217 19:04:25.070604 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 19:04:25 crc kubenswrapper[4892]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mv2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-dqv8z_openstack(56666973-7a1b-407e-a569-b888f83544e6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/56666973-7a1b-407e-a569-b888f83544e6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 19:04:25 crc kubenswrapper[4892]: > logger="UnhandledError" Feb 17 19:04:25 crc kubenswrapper[4892]: E0217 19:04:25.072300 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/56666973-7a1b-407e-a569-b888f83544e6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" podUID="56666973-7a1b-407e-a569-b888f83544e6" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.206570 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.208182 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.221409 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.222415 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.222680 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4g69t" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.222808 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.222978 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.225984 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238385 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-kolla-config\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238430 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c466a3b3-748c-4402-a029-ba4f30d2f660-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238452 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c466a3b3-748c-4402-a029-ba4f30d2f660-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238484 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c466a3b3-748c-4402-a029-ba4f30d2f660-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238544 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhmx7\" (UniqueName: \"kubernetes.io/projected/c466a3b3-748c-4402-a029-ba4f30d2f660-kube-api-access-rhmx7\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238562 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238600 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a312fd07-3a16-4945-a94e-2818d40359fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a312fd07-3a16-4945-a94e-2818d40359fc\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.238653 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-config-data-default\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: W0217 19:04:25.309918 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bc4ad2_030f_44cf_b4f6_f438435e0772.slice/crio-b4825c4510c50d4410d8225f799795899179336597e0d56d3b84ca1605f583fc WatchSource:0}: Error finding container b4825c4510c50d4410d8225f799795899179336597e0d56d3b84ca1605f583fc: Status 404 returned error can't find the container with id b4825c4510c50d4410d8225f799795899179336597e0d56d3b84ca1605f583fc Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.321362 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gmpwt"] Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340235 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c466a3b3-748c-4402-a029-ba4f30d2f660-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340333 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c466a3b3-748c-4402-a029-ba4f30d2f660-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340449 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhmx7\" (UniqueName: \"kubernetes.io/projected/c466a3b3-748c-4402-a029-ba4f30d2f660-kube-api-access-rhmx7\") pod 
\"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340487 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340554 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a312fd07-3a16-4945-a94e-2818d40359fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a312fd07-3a16-4945-a94e-2818d40359fc\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340651 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-config-data-default\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340714 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-kolla-config\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.340769 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c466a3b3-748c-4402-a029-ba4f30d2f660-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " 
pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.344197 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-kolla-config\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.344235 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-config-data-default\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.344327 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c466a3b3-748c-4402-a029-ba4f30d2f660-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.344361 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c466a3b3-748c-4402-a029-ba4f30d2f660-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.350507 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c466a3b3-748c-4402-a029-ba4f30d2f660-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.350715 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c466a3b3-748c-4402-a029-ba4f30d2f660-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.351138 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.351170 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a312fd07-3a16-4945-a94e-2818d40359fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a312fd07-3a16-4945-a94e-2818d40359fc\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b704e5d90c6fb2ccc2357483e56e135e9574ff75553b10ad10c1dc7ce421f58/globalmount\"" pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.365988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhmx7\" (UniqueName: \"kubernetes.io/projected/c466a3b3-748c-4402-a029-ba4f30d2f660-kube-api-access-rhmx7\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.392149 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a312fd07-3a16-4945-a94e-2818d40359fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a312fd07-3a16-4945-a94e-2818d40359fc\") pod \"openstack-galera-0\" (UID: \"c466a3b3-748c-4402-a029-ba4f30d2f660\") " pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.535392 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.744498 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.746920 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.753504 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.754142 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pq4ms" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.800513 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.806480 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" event={"ID":"2a2fba10-1e82-4580-b350-548b00de2a1f","Type":"ContainerStarted","Data":"deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2"} Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.806610 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.806894 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.806978 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.807964 4892 generic.go:334] "Generic (PLEG): container finished" podID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerID="885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47" exitCode=0 Feb 17 
19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.808035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerDied","Data":"885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47"} Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.808055 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerStarted","Data":"b4825c4510c50d4410d8225f799795899179336597e0d56d3b84ca1605f583fc"} Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.814620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4896fa46-2882-440a-b0fe-66ab208de548","Type":"ContainerStarted","Data":"e68d5c8a3ea05e79fed3a98e48d5b1f36cf30c0809748315d1b9b4d5226b7616"} Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.828253 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" podStartSLOduration=3.828234792 podStartE2EDuration="3.828234792s" podCreationTimestamp="2026-02-17 19:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:04:25.827513373 +0000 UTC m=+4837.202916638" watchObservedRunningTime="2026-02-17 19:04:25.828234792 +0000 UTC m=+4837.203638047" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.853119 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-config-data\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.853202 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-kolla-config\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.853265 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qzs\" (UniqueName: \"kubernetes.io/projected/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-kube-api-access-m5qzs\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.955347 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-config-data\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.955417 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-kolla-config\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.955496 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qzs\" (UniqueName: \"kubernetes.io/projected/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-kube-api-access-m5qzs\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.958755 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-kolla-config\") pod \"memcached-0\" 
(UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.959178 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-config-data\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:25 crc kubenswrapper[4892]: I0217 19:04:25.981386 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qzs\" (UniqueName: \"kubernetes.io/projected/51a16c44-9bb6-4250-8f1f-4617b6e20eb9-kube-api-access-m5qzs\") pod \"memcached-0\" (UID: \"51a16c44-9bb6-4250-8f1f-4617b6e20eb9\") " pod="openstack/memcached-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.070867 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.075282 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.400660 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.556387 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 19:04:26 crc kubenswrapper[4892]: W0217 19:04:26.770601 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc466a3b3_748c_4402_a029_ba4f30d2f660.slice/crio-da9c3a33acb1c52700aa77077a52b452f052a53d21812b3e15f2864f969dec41 WatchSource:0}: Error finding container da9c3a33acb1c52700aa77077a52b452f052a53d21812b3e15f2864f969dec41: Status 404 returned error can't find the container with id da9c3a33acb1c52700aa77077a52b452f052a53d21812b3e15f2864f969dec41 Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.785896 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.800764 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.800950 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.804473 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-n2bll" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.805084 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.805572 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.808574 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.846056 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5a1d044-9e9c-41ca-96e2-735248659691","Type":"ContainerStarted","Data":"0e1ac38ae4fa6b22abe369a93643c0a6ad6172744d4df53b1ba3eb6bf221a1cf"} Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.848644 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c466a3b3-748c-4402-a029-ba4f30d2f660","Type":"ContainerStarted","Data":"da9c3a33acb1c52700aa77077a52b452f052a53d21812b3e15f2864f969dec41"} Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.850651 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51a16c44-9bb6-4250-8f1f-4617b6e20eb9","Type":"ContainerStarted","Data":"37db7b81da8141ea8256d8c4804da32212a8a95be60e3a6debe14f060d569b50"} Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974523 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\") pod 
\"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974592 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cf068b-8714-4fc3-8a41-3af0baacc634-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974630 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974657 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974710 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974739 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/20cf068b-8714-4fc3-8a41-3af0baacc634-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974761 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfcdj\" (UniqueName: \"kubernetes.io/projected/20cf068b-8714-4fc3-8a41-3af0baacc634-kube-api-access-nfcdj\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:26 crc kubenswrapper[4892]: I0217 19:04:26.974837 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cf068b-8714-4fc3-8a41-3af0baacc634-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.023707 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.076864 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cf068b-8714-4fc3-8a41-3af0baacc634-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077166 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 
19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077197 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077246 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077277 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/20cf068b-8714-4fc3-8a41-3af0baacc634-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfcdj\" (UniqueName: \"kubernetes.io/projected/20cf068b-8714-4fc3-8a41-3af0baacc634-kube-api-access-nfcdj\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077352 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cf068b-8714-4fc3-8a41-3af0baacc634-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.077384 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.078729 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.078831 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/20cf068b-8714-4fc3-8a41-3af0baacc634-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.079555 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.081751 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20cf068b-8714-4fc3-8a41-3af0baacc634-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.081917 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cf068b-8714-4fc3-8a41-3af0baacc634-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.082896 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cf068b-8714-4fc3-8a41-3af0baacc634-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.083685 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.083742 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2ef7d5508c5dc503a509863dbb518597647ed84ce503e856b746ef4dc41f54c1/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.098271 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfcdj\" (UniqueName: \"kubernetes.io/projected/20cf068b-8714-4fc3-8a41-3af0baacc634-kube-api-access-nfcdj\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.113905 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb406c37-aedb-46a7-a3d6-e1ac314cb87a\") pod \"openstack-cell1-galera-0\" (UID: \"20cf068b-8714-4fc3-8a41-3af0baacc634\") " pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.187659 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.691629 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 19:04:27 crc kubenswrapper[4892]: W0217 19:04:27.693122 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20cf068b_8714_4fc3_8a41_3af0baacc634.slice/crio-f1ceec04b0b27662744848c6ebd0748006e1c7c0b0d24d58f8c4a13ed0c892b3 WatchSource:0}: Error finding container f1ceec04b0b27662744848c6ebd0748006e1c7c0b0d24d58f8c4a13ed0c892b3: Status 404 returned error can't find the container with id f1ceec04b0b27662744848c6ebd0748006e1c7c0b0d24d58f8c4a13ed0c892b3 Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.866954 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c466a3b3-748c-4402-a029-ba4f30d2f660","Type":"ContainerStarted","Data":"3e135a18f4cf5a72bf0dd704ef4c9d5a877f8c319c1f5f4ec4a7df7f4e8e3420"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.868787 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"20cf068b-8714-4fc3-8a41-3af0baacc634","Type":"ContainerStarted","Data":"98bd067c5a8f8b5fed1dc3f7cb087658f60469fefad039067902d3379b557f99"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.868873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"20cf068b-8714-4fc3-8a41-3af0baacc634","Type":"ContainerStarted","Data":"f1ceec04b0b27662744848c6ebd0748006e1c7c0b0d24d58f8c4a13ed0c892b3"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.872217 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4896fa46-2882-440a-b0fe-66ab208de548","Type":"ContainerStarted","Data":"cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.873800 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51a16c44-9bb6-4250-8f1f-4617b6e20eb9","Type":"ContainerStarted","Data":"015501c38c8ce417a30172298c0816ec3358cc8c29c26e08c2135f7fafbbfb7b"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.874915 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.877109 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" event={"ID":"56666973-7a1b-407e-a569-b888f83544e6","Type":"ContainerStarted","Data":"9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.877388 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.882550 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerStarted","Data":"fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26"} Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.969179 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.969157976 podStartE2EDuration="2.969157976s" podCreationTimestamp="2026-02-17 
19:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:04:27.936418824 +0000 UTC m=+4839.311822089" watchObservedRunningTime="2026-02-17 19:04:27.969157976 +0000 UTC m=+4839.344561241" Feb 17 19:04:27 crc kubenswrapper[4892]: I0217 19:04:27.990259 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" podStartSLOduration=5.990239665 podStartE2EDuration="5.990239665s" podCreationTimestamp="2026-02-17 19:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:04:27.98932844 +0000 UTC m=+4839.364731705" watchObservedRunningTime="2026-02-17 19:04:27.990239665 +0000 UTC m=+4839.365642930" Feb 17 19:04:28 crc kubenswrapper[4892]: I0217 19:04:28.447501 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dhjs7"] Feb 17 19:04:28 crc kubenswrapper[4892]: I0217 19:04:28.897392 4892 generic.go:334] "Generic (PLEG): container finished" podID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerID="fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26" exitCode=0 Feb 17 19:04:28 crc kubenswrapper[4892]: I0217 19:04:28.897587 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerDied","Data":"fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26"} Feb 17 19:04:29 crc kubenswrapper[4892]: I0217 19:04:29.908835 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerStarted","Data":"a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c"} Feb 17 19:04:29 crc kubenswrapper[4892]: I0217 19:04:29.908976 4892 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dhjs7" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="registry-server" containerID="cri-o://4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023" gracePeriod=2 Feb 17 19:04:29 crc kubenswrapper[4892]: I0217 19:04:29.935498 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gmpwt" podStartSLOduration=2.502636916 podStartE2EDuration="5.935480053s" podCreationTimestamp="2026-02-17 19:04:24 +0000 UTC" firstStartedPulling="2026-02-17 19:04:25.828017777 +0000 UTC m=+4837.203421042" lastFinishedPulling="2026-02-17 19:04:29.260860914 +0000 UTC m=+4840.636264179" observedRunningTime="2026-02-17 19:04:29.92720967 +0000 UTC m=+4841.302612945" watchObservedRunningTime="2026-02-17 19:04:29.935480053 +0000 UTC m=+4841.310883318" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.367040 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.447158 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-catalog-content\") pod \"ac86c7a4-9839-4d54-93b1-7451208d6955\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.447200 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbgqp\" (UniqueName: \"kubernetes.io/projected/ac86c7a4-9839-4d54-93b1-7451208d6955-kube-api-access-kbgqp\") pod \"ac86c7a4-9839-4d54-93b1-7451208d6955\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.447225 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-utilities\") pod \"ac86c7a4-9839-4d54-93b1-7451208d6955\" (UID: \"ac86c7a4-9839-4d54-93b1-7451208d6955\") " Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.448505 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-utilities" (OuterVolumeSpecName: "utilities") pod "ac86c7a4-9839-4d54-93b1-7451208d6955" (UID: "ac86c7a4-9839-4d54-93b1-7451208d6955"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.455591 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac86c7a4-9839-4d54-93b1-7451208d6955-kube-api-access-kbgqp" (OuterVolumeSpecName: "kube-api-access-kbgqp") pod "ac86c7a4-9839-4d54-93b1-7451208d6955" (UID: "ac86c7a4-9839-4d54-93b1-7451208d6955"). InnerVolumeSpecName "kube-api-access-kbgqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.510258 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac86c7a4-9839-4d54-93b1-7451208d6955" (UID: "ac86c7a4-9839-4d54-93b1-7451208d6955"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.549594 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.550854 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbgqp\" (UniqueName: \"kubernetes.io/projected/ac86c7a4-9839-4d54-93b1-7451208d6955-kube-api-access-kbgqp\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.550940 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86c7a4-9839-4d54-93b1-7451208d6955-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.918545 4892 generic.go:334] "Generic (PLEG): container finished" podID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerID="4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023" exitCode=0 Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.918637 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerDied","Data":"4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023"} Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.918676 4892 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dhjs7" event={"ID":"ac86c7a4-9839-4d54-93b1-7451208d6955","Type":"ContainerDied","Data":"ce3db951993e61f22eea5ba3f5be3b0587cb36d1bc9a95be79f370581055f1a5"} Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.918696 4892 scope.go:117] "RemoveContainer" containerID="4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.919400 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dhjs7" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.939464 4892 scope.go:117] "RemoveContainer" containerID="111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.965010 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dhjs7"] Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.973230 4892 scope.go:117] "RemoveContainer" containerID="ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.974763 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dhjs7"] Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.993279 4892 scope.go:117] "RemoveContainer" containerID="4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023" Feb 17 19:04:30 crc kubenswrapper[4892]: E0217 19:04:30.993723 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023\": container with ID starting with 4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023 not found: ID does not exist" containerID="4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 
19:04:30.993792 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023"} err="failed to get container status \"4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023\": rpc error: code = NotFound desc = could not find container \"4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023\": container with ID starting with 4b7025e9a8f400083409a07d4ea09be0bb451454c31e6c8aa70f4843b3fdc023 not found: ID does not exist" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.993855 4892 scope.go:117] "RemoveContainer" containerID="111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276" Feb 17 19:04:30 crc kubenswrapper[4892]: E0217 19:04:30.994134 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276\": container with ID starting with 111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276 not found: ID does not exist" containerID="111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.994165 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276"} err="failed to get container status \"111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276\": rpc error: code = NotFound desc = could not find container \"111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276\": container with ID starting with 111bbd95adfeb4bee4670eb2b347bc2da8012b730170a758fc120688144bd276 not found: ID does not exist" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.994187 4892 scope.go:117] "RemoveContainer" containerID="ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01" Feb 17 19:04:30 crc 
kubenswrapper[4892]: E0217 19:04:30.994513 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01\": container with ID starting with ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01 not found: ID does not exist" containerID="ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01" Feb 17 19:04:30 crc kubenswrapper[4892]: I0217 19:04:30.994570 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01"} err="failed to get container status \"ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01\": rpc error: code = NotFound desc = could not find container \"ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01\": container with ID starting with ff09ca47bd171face209a75299cb5f035bcc180f3fcc33d9613305cbd373ae01 not found: ID does not exist" Feb 17 19:04:31 crc kubenswrapper[4892]: I0217 19:04:31.376688 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" path="/var/lib/kubelet/pods/ac86c7a4-9839-4d54-93b1-7451208d6955/volumes" Feb 17 19:04:31 crc kubenswrapper[4892]: I0217 19:04:31.935956 4892 generic.go:334] "Generic (PLEG): container finished" podID="c466a3b3-748c-4402-a029-ba4f30d2f660" containerID="3e135a18f4cf5a72bf0dd704ef4c9d5a877f8c319c1f5f4ec4a7df7f4e8e3420" exitCode=0 Feb 17 19:04:31 crc kubenswrapper[4892]: I0217 19:04:31.936047 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c466a3b3-748c-4402-a029-ba4f30d2f660","Type":"ContainerDied","Data":"3e135a18f4cf5a72bf0dd704ef4c9d5a877f8c319c1f5f4ec4a7df7f4e8e3420"} Feb 17 19:04:32 crc kubenswrapper[4892]: I0217 19:04:32.953404 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"c466a3b3-748c-4402-a029-ba4f30d2f660","Type":"ContainerStarted","Data":"42cbefcb3d6b35739a174cc0a7bb6df228791bb602aa05356ec44462b60ea98a"} Feb 17 19:04:32 crc kubenswrapper[4892]: I0217 19:04:32.956199 4892 generic.go:334] "Generic (PLEG): container finished" podID="20cf068b-8714-4fc3-8a41-3af0baacc634" containerID="98bd067c5a8f8b5fed1dc3f7cb087658f60469fefad039067902d3379b557f99" exitCode=0 Feb 17 19:04:32 crc kubenswrapper[4892]: I0217 19:04:32.956234 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"20cf068b-8714-4fc3-8a41-3af0baacc634","Type":"ContainerDied","Data":"98bd067c5a8f8b5fed1dc3f7cb087658f60469fefad039067902d3379b557f99"} Feb 17 19:04:32 crc kubenswrapper[4892]: I0217 19:04:32.995205 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.995188519 podStartE2EDuration="8.995188519s" podCreationTimestamp="2026-02-17 19:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:04:32.993853884 +0000 UTC m=+4844.369257199" watchObservedRunningTime="2026-02-17 19:04:32.995188519 +0000 UTC m=+4844.370591794" Feb 17 19:04:33 crc kubenswrapper[4892]: I0217 19:04:33.018128 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:33 crc kubenswrapper[4892]: I0217 19:04:33.287712 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:04:33 crc kubenswrapper[4892]: I0217 19:04:33.339874 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dqv8z"] Feb 17 19:04:33 crc kubenswrapper[4892]: I0217 19:04:33.966798 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"20cf068b-8714-4fc3-8a41-3af0baacc634","Type":"ContainerStarted","Data":"a326ab873b2b9be7114cb698bf4d1632321b227ea6cd104a63fe48e837b99b3e"} Feb 17 19:04:33 crc kubenswrapper[4892]: I0217 19:04:33.967092 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" podUID="56666973-7a1b-407e-a569-b888f83544e6" containerName="dnsmasq-dns" containerID="cri-o://9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366" gracePeriod=10 Feb 17 19:04:33 crc kubenswrapper[4892]: I0217 19:04:33.994339 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.994314738 podStartE2EDuration="8.994314738s" podCreationTimestamp="2026-02-17 19:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:04:33.990710841 +0000 UTC m=+4845.366114116" watchObservedRunningTime="2026-02-17 19:04:33.994314738 +0000 UTC m=+4845.369718003" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.464218 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.641409 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mv2c\" (UniqueName: \"kubernetes.io/projected/56666973-7a1b-407e-a569-b888f83544e6-kube-api-access-9mv2c\") pod \"56666973-7a1b-407e-a569-b888f83544e6\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.641754 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-config\") pod \"56666973-7a1b-407e-a569-b888f83544e6\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.641892 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-dns-svc\") pod \"56666973-7a1b-407e-a569-b888f83544e6\" (UID: \"56666973-7a1b-407e-a569-b888f83544e6\") " Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.650762 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56666973-7a1b-407e-a569-b888f83544e6-kube-api-access-9mv2c" (OuterVolumeSpecName: "kube-api-access-9mv2c") pod "56666973-7a1b-407e-a569-b888f83544e6" (UID: "56666973-7a1b-407e-a569-b888f83544e6"). InnerVolumeSpecName "kube-api-access-9mv2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.687054 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-config" (OuterVolumeSpecName: "config") pod "56666973-7a1b-407e-a569-b888f83544e6" (UID: "56666973-7a1b-407e-a569-b888f83544e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.698691 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56666973-7a1b-407e-a569-b888f83544e6" (UID: "56666973-7a1b-407e-a569-b888f83544e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.743788 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.743857 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mv2c\" (UniqueName: \"kubernetes.io/projected/56666973-7a1b-407e-a569-b888f83544e6-kube-api-access-9mv2c\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.743877 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56666973-7a1b-407e-a569-b888f83544e6-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.770178 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.770246 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.975884 4892 generic.go:334] "Generic (PLEG): container finished" podID="56666973-7a1b-407e-a569-b888f83544e6" containerID="9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366" exitCode=0 Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.975925 4892 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" event={"ID":"56666973-7a1b-407e-a569-b888f83544e6","Type":"ContainerDied","Data":"9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366"} Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.975950 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" event={"ID":"56666973-7a1b-407e-a569-b888f83544e6","Type":"ContainerDied","Data":"5c7cb90f9dd6d24d9956df0242ecfa921651024b382a7c53f90c79427f49e14b"} Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.975966 4892 scope.go:117] "RemoveContainer" containerID="9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366" Feb 17 19:04:34 crc kubenswrapper[4892]: I0217 19:04:34.976071 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dqv8z" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.017243 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dqv8z"] Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.026744 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dqv8z"] Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.027028 4892 scope.go:117] "RemoveContainer" containerID="76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.061323 4892 scope.go:117] "RemoveContainer" containerID="9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366" Feb 17 19:04:35 crc kubenswrapper[4892]: E0217 19:04:35.073414 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366\": container with ID starting with 9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366 not found: ID does not exist" 
containerID="9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.073466 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366"} err="failed to get container status \"9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366\": rpc error: code = NotFound desc = could not find container \"9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366\": container with ID starting with 9df25a4e7ca6936cde507f54ed3e86d03159bb8dbd1a3fa927de448a8e1b0366 not found: ID does not exist" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.073495 4892 scope.go:117] "RemoveContainer" containerID="76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22" Feb 17 19:04:35 crc kubenswrapper[4892]: E0217 19:04:35.074419 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22\": container with ID starting with 76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22 not found: ID does not exist" containerID="76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.074484 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22"} err="failed to get container status \"76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22\": rpc error: code = NotFound desc = could not find container \"76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22\": container with ID starting with 76e70fd3781d305cae627a60666a54aa7d7e1faa5a35593075f275ae60d23a22 not found: ID does not exist" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.374004 4892 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56666973-7a1b-407e-a569-b888f83544e6" path="/var/lib/kubelet/pods/56666973-7a1b-407e-a569-b888f83544e6/volumes" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.536517 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.536590 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 17 19:04:35 crc kubenswrapper[4892]: I0217 19:04:35.832274 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gmpwt" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="registry-server" probeResult="failure" output=< Feb 17 19:04:35 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:04:35 crc kubenswrapper[4892]: > Feb 17 19:04:36 crc kubenswrapper[4892]: I0217 19:04:36.077282 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.187715 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.188045 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.425399 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.425470 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.425516 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.426262 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"330e1bd88e0b3cdc2508642d4af65f7484d521050f49965de99307ecfc4b06ea"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.426329 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://330e1bd88e0b3cdc2508642d4af65f7484d521050f49965de99307ecfc4b06ea" gracePeriod=600 Feb 17 19:04:37 crc kubenswrapper[4892]: I0217 19:04:37.975229 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 17 19:04:38 crc kubenswrapper[4892]: I0217 19:04:38.044425 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="330e1bd88e0b3cdc2508642d4af65f7484d521050f49965de99307ecfc4b06ea" exitCode=0 Feb 17 19:04:38 crc kubenswrapper[4892]: I0217 19:04:38.044599 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"330e1bd88e0b3cdc2508642d4af65f7484d521050f49965de99307ecfc4b06ea"} Feb 17 19:04:38 crc kubenswrapper[4892]: I0217 19:04:38.044881 4892 scope.go:117] "RemoveContainer" containerID="d09654a36203bb03d7f581d946a6aff0b561d94b30f8b6769499f728c32bfc51" Feb 17 19:04:38 crc kubenswrapper[4892]: I0217 19:04:38.101630 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 17 19:04:38 crc kubenswrapper[4892]: I0217 19:04:38.114181 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:38 crc kubenswrapper[4892]: I0217 19:04:38.206043 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="20cf068b-8714-4fc3-8a41-3af0baacc634" containerName="galera" probeResult="failure" output=< Feb 17 19:04:38 crc kubenswrapper[4892]: wsrep_local_state_comment (Joined) differs from Synced Feb 17 19:04:38 crc kubenswrapper[4892]: > Feb 17 19:04:39 crc kubenswrapper[4892]: I0217 19:04:39.054550 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39"} Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.223920 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xnbm7"] Feb 17 19:04:44 crc kubenswrapper[4892]: E0217 19:04:44.224895 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56666973-7a1b-407e-a569-b888f83544e6" containerName="dnsmasq-dns" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.224913 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="56666973-7a1b-407e-a569-b888f83544e6" containerName="dnsmasq-dns" Feb 17 19:04:44 crc 
kubenswrapper[4892]: E0217 19:04:44.224947 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="registry-server" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.224957 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="registry-server" Feb 17 19:04:44 crc kubenswrapper[4892]: E0217 19:04:44.224975 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="extract-utilities" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.224983 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="extract-utilities" Feb 17 19:04:44 crc kubenswrapper[4892]: E0217 19:04:44.224996 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56666973-7a1b-407e-a569-b888f83544e6" containerName="init" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.225003 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="56666973-7a1b-407e-a569-b888f83544e6" containerName="init" Feb 17 19:04:44 crc kubenswrapper[4892]: E0217 19:04:44.225024 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="extract-content" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.225032 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="extract-content" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.225235 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="56666973-7a1b-407e-a569-b888f83544e6" containerName="dnsmasq-dns" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.225253 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac86c7a4-9839-4d54-93b1-7451208d6955" containerName="registry-server" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 
19:04:44.225837 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.227902 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.237734 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xnbm7"] Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.329244 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqwnw\" (UniqueName: \"kubernetes.io/projected/59ba8b04-0fa0-48b5-afff-ad1921668fec-kube-api-access-qqwnw\") pod \"root-account-create-update-xnbm7\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.329365 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ba8b04-0fa0-48b5-afff-ad1921668fec-operator-scripts\") pod \"root-account-create-update-xnbm7\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.432452 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqwnw\" (UniqueName: \"kubernetes.io/projected/59ba8b04-0fa0-48b5-afff-ad1921668fec-kube-api-access-qqwnw\") pod \"root-account-create-update-xnbm7\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.432741 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/59ba8b04-0fa0-48b5-afff-ad1921668fec-operator-scripts\") pod \"root-account-create-update-xnbm7\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.434160 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ba8b04-0fa0-48b5-afff-ad1921668fec-operator-scripts\") pod \"root-account-create-update-xnbm7\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.459938 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqwnw\" (UniqueName: \"kubernetes.io/projected/59ba8b04-0fa0-48b5-afff-ad1921668fec-kube-api-access-qqwnw\") pod \"root-account-create-update-xnbm7\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.560481 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.815168 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:44 crc kubenswrapper[4892]: I0217 19:04:44.864235 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:45 crc kubenswrapper[4892]: I0217 19:04:45.012499 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xnbm7"] Feb 17 19:04:45 crc kubenswrapper[4892]: I0217 19:04:45.051104 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gmpwt"] Feb 17 19:04:45 crc kubenswrapper[4892]: I0217 19:04:45.119454 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xnbm7" event={"ID":"59ba8b04-0fa0-48b5-afff-ad1921668fec","Type":"ContainerStarted","Data":"b6a509a3c0171e34c957426fe818f1340af0c56831a080ce51863709800a3db1"} Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.136770 4892 generic.go:334] "Generic (PLEG): container finished" podID="59ba8b04-0fa0-48b5-afff-ad1921668fec" containerID="a7d7dabdbc951e58d287c7dbfcdd34c3144fd4ee67a2f404f74880801b24ae94" exitCode=0 Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.136860 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xnbm7" event={"ID":"59ba8b04-0fa0-48b5-afff-ad1921668fec","Type":"ContainerDied","Data":"a7d7dabdbc951e58d287c7dbfcdd34c3144fd4ee67a2f404f74880801b24ae94"} Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.137605 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gmpwt" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="registry-server" 
containerID="cri-o://a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c" gracePeriod=2 Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.657975 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.778720 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-catalog-content\") pod \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.778892 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-utilities\") pod \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.778939 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8fwk\" (UniqueName: \"kubernetes.io/projected/f8bc4ad2-030f-44cf-b4f6-f438435e0772-kube-api-access-s8fwk\") pod \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\" (UID: \"f8bc4ad2-030f-44cf-b4f6-f438435e0772\") " Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.780213 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-utilities" (OuterVolumeSpecName: "utilities") pod "f8bc4ad2-030f-44cf-b4f6-f438435e0772" (UID: "f8bc4ad2-030f-44cf-b4f6-f438435e0772"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.784733 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bc4ad2-030f-44cf-b4f6-f438435e0772-kube-api-access-s8fwk" (OuterVolumeSpecName: "kube-api-access-s8fwk") pod "f8bc4ad2-030f-44cf-b4f6-f438435e0772" (UID: "f8bc4ad2-030f-44cf-b4f6-f438435e0772"). InnerVolumeSpecName "kube-api-access-s8fwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.881517 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.881563 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8fwk\" (UniqueName: \"kubernetes.io/projected/f8bc4ad2-030f-44cf-b4f6-f438435e0772-kube-api-access-s8fwk\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.929084 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8bc4ad2-030f-44cf-b4f6-f438435e0772" (UID: "f8bc4ad2-030f-44cf-b4f6-f438435e0772"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:04:46 crc kubenswrapper[4892]: I0217 19:04:46.983899 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8bc4ad2-030f-44cf-b4f6-f438435e0772-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.150405 4892 generic.go:334] "Generic (PLEG): container finished" podID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerID="a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c" exitCode=0 Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.150488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerDied","Data":"a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c"} Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.150552 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmpwt" event={"ID":"f8bc4ad2-030f-44cf-b4f6-f438435e0772","Type":"ContainerDied","Data":"b4825c4510c50d4410d8225f799795899179336597e0d56d3b84ca1605f583fc"} Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.150582 4892 scope.go:117] "RemoveContainer" containerID="a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.150597 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gmpwt" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.189980 4892 scope.go:117] "RemoveContainer" containerID="fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.202190 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gmpwt"] Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.214932 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gmpwt"] Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.269582 4892 scope.go:117] "RemoveContainer" containerID="885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.308329 4892 scope.go:117] "RemoveContainer" containerID="a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c" Feb 17 19:04:47 crc kubenswrapper[4892]: E0217 19:04:47.312103 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c\": container with ID starting with a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c not found: ID does not exist" containerID="a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.312140 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c"} err="failed to get container status \"a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c\": rpc error: code = NotFound desc = could not find container \"a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c\": container with ID starting with a0c5d6274d7a07dcff285f43d8d495bfb8d537f4f998699d4d2725be2d419e6c not found: ID does 
not exist" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.312165 4892 scope.go:117] "RemoveContainer" containerID="fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26" Feb 17 19:04:47 crc kubenswrapper[4892]: E0217 19:04:47.312669 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26\": container with ID starting with fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26 not found: ID does not exist" containerID="fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.312692 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26"} err="failed to get container status \"fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26\": rpc error: code = NotFound desc = could not find container \"fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26\": container with ID starting with fd20c06120e57488e38683b55398d732644e3c70cdabd87e6e92d70664d97d26 not found: ID does not exist" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.312710 4892 scope.go:117] "RemoveContainer" containerID="885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47" Feb 17 19:04:47 crc kubenswrapper[4892]: E0217 19:04:47.313004 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47\": container with ID starting with 885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47 not found: ID does not exist" containerID="885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.313026 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47"} err="failed to get container status \"885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47\": rpc error: code = NotFound desc = could not find container \"885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47\": container with ID starting with 885aa634fd84897552f7606900f3778a60b31c3bb85dfec472089b73a6484f47 not found: ID does not exist" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.336674 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.401515 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" path="/var/lib/kubelet/pods/f8bc4ad2-030f-44cf-b4f6-f438435e0772/volumes" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.541320 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.607003 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ba8b04-0fa0-48b5-afff-ad1921668fec-operator-scripts\") pod \"59ba8b04-0fa0-48b5-afff-ad1921668fec\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.607154 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqwnw\" (UniqueName: \"kubernetes.io/projected/59ba8b04-0fa0-48b5-afff-ad1921668fec-kube-api-access-qqwnw\") pod \"59ba8b04-0fa0-48b5-afff-ad1921668fec\" (UID: \"59ba8b04-0fa0-48b5-afff-ad1921668fec\") " Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.607590 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ba8b04-0fa0-48b5-afff-ad1921668fec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59ba8b04-0fa0-48b5-afff-ad1921668fec" (UID: "59ba8b04-0fa0-48b5-afff-ad1921668fec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.612922 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ba8b04-0fa0-48b5-afff-ad1921668fec-kube-api-access-qqwnw" (OuterVolumeSpecName: "kube-api-access-qqwnw") pod "59ba8b04-0fa0-48b5-afff-ad1921668fec" (UID: "59ba8b04-0fa0-48b5-afff-ad1921668fec"). InnerVolumeSpecName "kube-api-access-qqwnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.709569 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqwnw\" (UniqueName: \"kubernetes.io/projected/59ba8b04-0fa0-48b5-afff-ad1921668fec-kube-api-access-qqwnw\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:47 crc kubenswrapper[4892]: I0217 19:04:47.709610 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ba8b04-0fa0-48b5-afff-ad1921668fec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:48 crc kubenswrapper[4892]: I0217 19:04:48.163669 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xnbm7" event={"ID":"59ba8b04-0fa0-48b5-afff-ad1921668fec","Type":"ContainerDied","Data":"b6a509a3c0171e34c957426fe818f1340af0c56831a080ce51863709800a3db1"} Feb 17 19:04:48 crc kubenswrapper[4892]: I0217 19:04:48.163970 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6a509a3c0171e34c957426fe818f1340af0c56831a080ce51863709800a3db1" Feb 17 19:04:48 crc kubenswrapper[4892]: I0217 19:04:48.163733 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xnbm7" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.736678 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xnbm7"] Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.744410 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xnbm7"] Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.835329 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2j98k"] Feb 17 19:04:55 crc kubenswrapper[4892]: E0217 19:04:55.837001 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ba8b04-0fa0-48b5-afff-ad1921668fec" containerName="mariadb-account-create-update" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.837030 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ba8b04-0fa0-48b5-afff-ad1921668fec" containerName="mariadb-account-create-update" Feb 17 19:04:55 crc kubenswrapper[4892]: E0217 19:04:55.837051 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="extract-utilities" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.837060 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="extract-utilities" Feb 17 19:04:55 crc kubenswrapper[4892]: E0217 19:04:55.837077 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="extract-content" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.837087 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="extract-content" Feb 17 19:04:55 crc kubenswrapper[4892]: E0217 19:04:55.837116 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" 
containerName="registry-server" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.837124 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="registry-server" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.838632 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ba8b04-0fa0-48b5-afff-ad1921668fec" containerName="mariadb-account-create-update" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.838851 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bc4ad2-030f-44cf-b4f6-f438435e0772" containerName="registry-server" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.842071 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.848465 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.924932 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2j98k"] Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.988871 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5l25\" (UniqueName: \"kubernetes.io/projected/69504bab-0d0d-49dd-9e90-42682a7906ae-kube-api-access-l5l25\") pod \"root-account-create-update-2j98k\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:55 crc kubenswrapper[4892]: I0217 19:04:55.989180 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69504bab-0d0d-49dd-9e90-42682a7906ae-operator-scripts\") pod \"root-account-create-update-2j98k\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " 
pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:56 crc kubenswrapper[4892]: I0217 19:04:56.090561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69504bab-0d0d-49dd-9e90-42682a7906ae-operator-scripts\") pod \"root-account-create-update-2j98k\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:56 crc kubenswrapper[4892]: I0217 19:04:56.090728 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5l25\" (UniqueName: \"kubernetes.io/projected/69504bab-0d0d-49dd-9e90-42682a7906ae-kube-api-access-l5l25\") pod \"root-account-create-update-2j98k\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:56 crc kubenswrapper[4892]: I0217 19:04:56.091391 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69504bab-0d0d-49dd-9e90-42682a7906ae-operator-scripts\") pod \"root-account-create-update-2j98k\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:56 crc kubenswrapper[4892]: I0217 19:04:56.120704 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5l25\" (UniqueName: \"kubernetes.io/projected/69504bab-0d0d-49dd-9e90-42682a7906ae-kube-api-access-l5l25\") pod \"root-account-create-update-2j98k\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:56 crc kubenswrapper[4892]: I0217 19:04:56.225584 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:56 crc kubenswrapper[4892]: I0217 19:04:56.717656 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2j98k"] Feb 17 19:04:56 crc kubenswrapper[4892]: W0217 19:04:56.722064 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69504bab_0d0d_49dd_9e90_42682a7906ae.slice/crio-b5dafad53bb5e3bb0d7b47de0f26a45aa2faca66524460fb159f66ef19b585e6 WatchSource:0}: Error finding container b5dafad53bb5e3bb0d7b47de0f26a45aa2faca66524460fb159f66ef19b585e6: Status 404 returned error can't find the container with id b5dafad53bb5e3bb0d7b47de0f26a45aa2faca66524460fb159f66ef19b585e6 Feb 17 19:04:57 crc kubenswrapper[4892]: I0217 19:04:57.270303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2j98k" event={"ID":"69504bab-0d0d-49dd-9e90-42682a7906ae","Type":"ContainerStarted","Data":"b5dafad53bb5e3bb0d7b47de0f26a45aa2faca66524460fb159f66ef19b585e6"} Feb 17 19:04:57 crc kubenswrapper[4892]: I0217 19:04:57.370915 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ba8b04-0fa0-48b5-afff-ad1921668fec" path="/var/lib/kubelet/pods/59ba8b04-0fa0-48b5-afff-ad1921668fec/volumes" Feb 17 19:04:58 crc kubenswrapper[4892]: I0217 19:04:58.283393 4892 generic.go:334] "Generic (PLEG): container finished" podID="69504bab-0d0d-49dd-9e90-42682a7906ae" containerID="a0927066ab9bf2b90db1ce0228bcb3dfed54eb95b44fcdf34522a6f3c58fbf28" exitCode=0 Feb 17 19:04:58 crc kubenswrapper[4892]: I0217 19:04:58.283454 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2j98k" event={"ID":"69504bab-0d0d-49dd-9e90-42682a7906ae","Type":"ContainerDied","Data":"a0927066ab9bf2b90db1ce0228bcb3dfed54eb95b44fcdf34522a6f3c58fbf28"} Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.294647 4892 
generic.go:334] "Generic (PLEG): container finished" podID="4896fa46-2882-440a-b0fe-66ab208de548" containerID="cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b" exitCode=0 Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.294718 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4896fa46-2882-440a-b0fe-66ab208de548","Type":"ContainerDied","Data":"cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b"} Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.703925 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2j98k" Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.855643 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69504bab-0d0d-49dd-9e90-42682a7906ae-operator-scripts\") pod \"69504bab-0d0d-49dd-9e90-42682a7906ae\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.855747 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5l25\" (UniqueName: \"kubernetes.io/projected/69504bab-0d0d-49dd-9e90-42682a7906ae-kube-api-access-l5l25\") pod \"69504bab-0d0d-49dd-9e90-42682a7906ae\" (UID: \"69504bab-0d0d-49dd-9e90-42682a7906ae\") " Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.856391 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69504bab-0d0d-49dd-9e90-42682a7906ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69504bab-0d0d-49dd-9e90-42682a7906ae" (UID: "69504bab-0d0d-49dd-9e90-42682a7906ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.862996 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69504bab-0d0d-49dd-9e90-42682a7906ae-kube-api-access-l5l25" (OuterVolumeSpecName: "kube-api-access-l5l25") pod "69504bab-0d0d-49dd-9e90-42682a7906ae" (UID: "69504bab-0d0d-49dd-9e90-42682a7906ae"). InnerVolumeSpecName "kube-api-access-l5l25". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.957483 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69504bab-0d0d-49dd-9e90-42682a7906ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:04:59 crc kubenswrapper[4892]: I0217 19:04:59.957509 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5l25\" (UniqueName: \"kubernetes.io/projected/69504bab-0d0d-49dd-9e90-42682a7906ae-kube-api-access-l5l25\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.306627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4896fa46-2882-440a-b0fe-66ab208de548","Type":"ContainerStarted","Data":"ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1"} Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.306869 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.308735 4892 generic.go:334] "Generic (PLEG): container finished" podID="f5a1d044-9e9c-41ca-96e2-735248659691" containerID="0e1ac38ae4fa6b22abe369a93643c0a6ad6172744d4df53b1ba3eb6bf221a1cf" exitCode=0 Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.308791 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f5a1d044-9e9c-41ca-96e2-735248659691","Type":"ContainerDied","Data":"0e1ac38ae4fa6b22abe369a93643c0a6ad6172744d4df53b1ba3eb6bf221a1cf"} Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.312429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2j98k" event={"ID":"69504bab-0d0d-49dd-9e90-42682a7906ae","Type":"ContainerDied","Data":"b5dafad53bb5e3bb0d7b47de0f26a45aa2faca66524460fb159f66ef19b585e6"} Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.312478 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5dafad53bb5e3bb0d7b47de0f26a45aa2faca66524460fb159f66ef19b585e6" Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.312544 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2j98k" Feb 17 19:05:00 crc kubenswrapper[4892]: I0217 19:05:00.352447 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.352432843 podStartE2EDuration="37.352432843s" podCreationTimestamp="2026-02-17 19:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:05:00.341623932 +0000 UTC m=+4871.717027197" watchObservedRunningTime="2026-02-17 19:05:00.352432843 +0000 UTC m=+4871.727836108" Feb 17 19:05:01 crc kubenswrapper[4892]: I0217 19:05:01.325006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5a1d044-9e9c-41ca-96e2-735248659691","Type":"ContainerStarted","Data":"32a80db7adf34cf8be0fb7837d28f1a3c2f3f3ab2fdf42369624edf06010d26a"} Feb 17 19:05:01 crc kubenswrapper[4892]: I0217 19:05:01.325803 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 19:05:01 crc kubenswrapper[4892]: I0217 19:05:01.357047 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.35702913 podStartE2EDuration="39.35702913s" podCreationTimestamp="2026-02-17 19:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:05:01.351511181 +0000 UTC m=+4872.726914446" watchObservedRunningTime="2026-02-17 19:05:01.35702913 +0000 UTC m=+4872.732432415" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.656593 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpwp7"] Feb 17 19:05:09 crc kubenswrapper[4892]: E0217 19:05:09.657391 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69504bab-0d0d-49dd-9e90-42682a7906ae" containerName="mariadb-account-create-update" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.657404 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="69504bab-0d0d-49dd-9e90-42682a7906ae" containerName="mariadb-account-create-update" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.657567 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="69504bab-0d0d-49dd-9e90-42682a7906ae" containerName="mariadb-account-create-update" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.658776 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.722509 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-catalog-content\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.722580 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-utilities\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.723126 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkvr\" (UniqueName: \"kubernetes.io/projected/29beb521-cea5-4c7f-a444-d20860c48f12-kube-api-access-6tkvr\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.736923 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpwp7"] Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.825134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkvr\" (UniqueName: \"kubernetes.io/projected/29beb521-cea5-4c7f-a444-d20860c48f12-kube-api-access-6tkvr\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.825202 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-catalog-content\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.825237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-utilities\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.825689 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-utilities\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.825720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-catalog-content\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.848268 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkvr\" (UniqueName: \"kubernetes.io/projected/29beb521-cea5-4c7f-a444-d20860c48f12-kube-api-access-6tkvr\") pod \"community-operators-qpwp7\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:09 crc kubenswrapper[4892]: I0217 19:05:09.980485 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:10 crc kubenswrapper[4892]: I0217 19:05:10.485303 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpwp7"] Feb 17 19:05:11 crc kubenswrapper[4892]: I0217 19:05:11.452121 4892 generic.go:334] "Generic (PLEG): container finished" podID="29beb521-cea5-4c7f-a444-d20860c48f12" containerID="64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587" exitCode=0 Feb 17 19:05:11 crc kubenswrapper[4892]: I0217 19:05:11.452192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerDied","Data":"64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587"} Feb 17 19:05:11 crc kubenswrapper[4892]: I0217 19:05:11.452420 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerStarted","Data":"832a8c6bf1e0308e09d1ba4e2a0c60ed943d999683e213b95daa2074fb5e3494"} Feb 17 19:05:12 crc kubenswrapper[4892]: I0217 19:05:12.468149 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerStarted","Data":"725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d"} Feb 17 19:05:13 crc kubenswrapper[4892]: I0217 19:05:13.479355 4892 generic.go:334] "Generic (PLEG): container finished" podID="29beb521-cea5-4c7f-a444-d20860c48f12" containerID="725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d" exitCode=0 Feb 17 19:05:13 crc kubenswrapper[4892]: I0217 19:05:13.479447 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" 
event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerDied","Data":"725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d"} Feb 17 19:05:14 crc kubenswrapper[4892]: I0217 19:05:14.170059 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 19:05:14 crc kubenswrapper[4892]: I0217 19:05:14.505182 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:14 crc kubenswrapper[4892]: I0217 19:05:14.512387 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerStarted","Data":"225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6"} Feb 17 19:05:19 crc kubenswrapper[4892]: I0217 19:05:19.981724 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:19 crc kubenswrapper[4892]: I0217 19:05:19.982286 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.046222 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.080566 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qpwp7" podStartSLOduration=8.657311231 podStartE2EDuration="11.080533007s" podCreationTimestamp="2026-02-17 19:05:09 +0000 UTC" firstStartedPulling="2026-02-17 19:05:11.455110326 +0000 UTC m=+4882.830513611" lastFinishedPulling="2026-02-17 19:05:13.878332112 +0000 UTC m=+4885.253735387" observedRunningTime="2026-02-17 19:05:14.565438408 +0000 UTC m=+4885.940841673" watchObservedRunningTime="2026-02-17 19:05:20.080533007 
+0000 UTC m=+4891.455936312" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.363970 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-mb544"] Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.365405 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.388145 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-mb544"] Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.418365 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.419204 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prs44\" (UniqueName: \"kubernetes.io/projected/7d80642f-243c-4542-b750-70696772e89d-kube-api-access-prs44\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.419274 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-config\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.521301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prs44\" (UniqueName: 
\"kubernetes.io/projected/7d80642f-243c-4542-b750-70696772e89d-kube-api-access-prs44\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.521377 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-config\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.521481 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.522857 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.522878 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-config\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.539599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prs44\" (UniqueName: \"kubernetes.io/projected/7d80642f-243c-4542-b750-70696772e89d-kube-api-access-prs44\") pod \"dnsmasq-dns-5b7946d7b9-mb544\" 
(UID: \"7d80642f-243c-4542-b750-70696772e89d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.630405 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.683179 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:20 crc kubenswrapper[4892]: I0217 19:05:20.689902 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpwp7"] Feb 17 19:05:21 crc kubenswrapper[4892]: I0217 19:05:21.106133 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:05:21 crc kubenswrapper[4892]: I0217 19:05:21.142640 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-mb544"] Feb 17 19:05:21 crc kubenswrapper[4892]: W0217 19:05:21.268382 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d80642f_243c_4542_b750_70696772e89d.slice/crio-d9452f1662bab52c8aa2e4972e25e6404df48084df5624f167050de142cebcea WatchSource:0}: Error finding container d9452f1662bab52c8aa2e4972e25e6404df48084df5624f167050de142cebcea: Status 404 returned error can't find the container with id d9452f1662bab52c8aa2e4972e25e6404df48084df5624f167050de142cebcea Feb 17 19:05:21 crc kubenswrapper[4892]: I0217 19:05:21.588246 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d80642f-243c-4542-b750-70696772e89d" containerID="e3443eccf8ac9db843bff666c60a97e0c8b0794fde6e3954541215db1dc302d4" exitCode=0 Feb 17 19:05:21 crc kubenswrapper[4892]: I0217 19:05:21.588478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" 
event={"ID":"7d80642f-243c-4542-b750-70696772e89d","Type":"ContainerDied","Data":"e3443eccf8ac9db843bff666c60a97e0c8b0794fde6e3954541215db1dc302d4"} Feb 17 19:05:21 crc kubenswrapper[4892]: I0217 19:05:21.596553 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" event={"ID":"7d80642f-243c-4542-b750-70696772e89d","Type":"ContainerStarted","Data":"d9452f1662bab52c8aa2e4972e25e6404df48084df5624f167050de142cebcea"} Feb 17 19:05:22 crc kubenswrapper[4892]: I0217 19:05:22.150334 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:05:22 crc kubenswrapper[4892]: I0217 19:05:22.608063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" event={"ID":"7d80642f-243c-4542-b750-70696772e89d","Type":"ContainerStarted","Data":"0508f7e80818138c836a3a4efe169f8954d5e9e826faa4000e6ab500b126fb12"} Feb 17 19:05:22 crc kubenswrapper[4892]: I0217 19:05:22.608255 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qpwp7" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="registry-server" containerID="cri-o://225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6" gracePeriod=2 Feb 17 19:05:22 crc kubenswrapper[4892]: I0217 19:05:22.636546 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" podStartSLOduration=2.6365208129999997 podStartE2EDuration="2.636520813s" podCreationTimestamp="2026-02-17 19:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:05:22.631004764 +0000 UTC m=+4894.006408029" watchObservedRunningTime="2026-02-17 19:05:22.636520813 +0000 UTC m=+4894.011924078" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.534994 4892 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.573324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tkvr\" (UniqueName: \"kubernetes.io/projected/29beb521-cea5-4c7f-a444-d20860c48f12-kube-api-access-6tkvr\") pod \"29beb521-cea5-4c7f-a444-d20860c48f12\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.573411 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-utilities\") pod \"29beb521-cea5-4c7f-a444-d20860c48f12\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.573688 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-catalog-content\") pod \"29beb521-cea5-4c7f-a444-d20860c48f12\" (UID: \"29beb521-cea5-4c7f-a444-d20860c48f12\") " Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.574669 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-utilities" (OuterVolumeSpecName: "utilities") pod "29beb521-cea5-4c7f-a444-d20860c48f12" (UID: "29beb521-cea5-4c7f-a444-d20860c48f12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.581538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29beb521-cea5-4c7f-a444-d20860c48f12-kube-api-access-6tkvr" (OuterVolumeSpecName: "kube-api-access-6tkvr") pod "29beb521-cea5-4c7f-a444-d20860c48f12" (UID: "29beb521-cea5-4c7f-a444-d20860c48f12"). InnerVolumeSpecName "kube-api-access-6tkvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.621967 4892 generic.go:334] "Generic (PLEG): container finished" podID="29beb521-cea5-4c7f-a444-d20860c48f12" containerID="225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6" exitCode=0 Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.622104 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpwp7" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.622087 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerDied","Data":"225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6"} Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.622221 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpwp7" event={"ID":"29beb521-cea5-4c7f-a444-d20860c48f12","Type":"ContainerDied","Data":"832a8c6bf1e0308e09d1ba4e2a0c60ed943d999683e213b95daa2074fb5e3494"} Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.622511 4892 scope.go:117] "RemoveContainer" containerID="225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.622551 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.647341 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29beb521-cea5-4c7f-a444-d20860c48f12" (UID: "29beb521-cea5-4c7f-a444-d20860c48f12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.655760 4892 scope.go:117] "RemoveContainer" containerID="725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.677764 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.677799 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tkvr\" (UniqueName: \"kubernetes.io/projected/29beb521-cea5-4c7f-a444-d20860c48f12-kube-api-access-6tkvr\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.677810 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29beb521-cea5-4c7f-a444-d20860c48f12-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.679452 4892 scope.go:117] "RemoveContainer" containerID="64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.708338 4892 scope.go:117] "RemoveContainer" containerID="225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6" Feb 17 19:05:23 crc kubenswrapper[4892]: E0217 19:05:23.708992 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6\": container with ID starting with 225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6 not found: ID does not exist" containerID="225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.709072 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6"} err="failed to get container status \"225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6\": rpc error: code = NotFound desc = could not find container \"225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6\": container with ID starting with 225d2a65128a7e817f47a5841b97dbfa983ed6aa52f69893da4f59e6b65969e6 not found: ID does not exist" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.709136 4892 scope.go:117] "RemoveContainer" containerID="725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d" Feb 17 19:05:23 crc kubenswrapper[4892]: E0217 19:05:23.709586 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d\": container with ID starting with 725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d not found: ID does not exist" containerID="725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.709633 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d"} err="failed to get container status \"725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d\": rpc error: code = NotFound desc = could not find container \"725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d\": container with ID starting with 725eba507422391a6cecc65b5a9a5257222c4b6cce65a0aa552e0e96f56cff3d not found: ID does not exist" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.709669 4892 scope.go:117] "RemoveContainer" containerID="64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587" Feb 17 19:05:23 crc kubenswrapper[4892]: E0217 19:05:23.710092 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587\": container with ID starting with 64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587 not found: ID does not exist" containerID="64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.710126 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587"} err="failed to get container status \"64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587\": rpc error: code = NotFound desc = could not find container \"64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587\": container with ID starting with 64f8c0c11e683db3cea941847d20f7ec829c5375fc465654d9788aac4b90d587 not found: ID does not exist" Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.956211 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpwp7"] Feb 17 19:05:23 crc kubenswrapper[4892]: I0217 19:05:23.963849 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qpwp7"] Feb 17 19:05:25 crc kubenswrapper[4892]: I0217 19:05:25.375129 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" path="/var/lib/kubelet/pods/29beb521-cea5-4c7f-a444-d20860c48f12/volumes" Feb 17 19:05:29 crc kubenswrapper[4892]: I0217 19:05:29.128759 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" containerName="rabbitmq" containerID="cri-o://32a80db7adf34cf8be0fb7837d28f1a3c2f3f3ab2fdf42369624edf06010d26a" gracePeriod=52 Feb 17 19:05:29 crc kubenswrapper[4892]: I0217 19:05:29.813937 4892 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4896fa46-2882-440a-b0fe-66ab208de548" containerName="rabbitmq" containerID="cri-o://ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1" gracePeriod=53 Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.682380 4892 generic.go:334] "Generic (PLEG): container finished" podID="f5a1d044-9e9c-41ca-96e2-735248659691" containerID="32a80db7adf34cf8be0fb7837d28f1a3c2f3f3ab2fdf42369624edf06010d26a" exitCode=0 Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.682451 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5a1d044-9e9c-41ca-96e2-735248659691","Type":"ContainerDied","Data":"32a80db7adf34cf8be0fb7837d28f1a3c2f3f3ab2fdf42369624edf06010d26a"} Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.684782 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.741903 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-2jqm4"] Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.742120 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerName="dnsmasq-dns" containerID="cri-o://deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2" gracePeriod=10 Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.857419 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902558 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-erlang-cookie\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902631 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5a1d044-9e9c-41ca-96e2-735248659691-erlang-cookie-secret\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902671 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-confd\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902714 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-server-conf\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902846 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5a1d044-9e9c-41ca-96e2-735248659691-pod-info\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902903 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-plugins-conf\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902932 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4b86\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-kube-api-access-d4b86\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.902978 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-plugins\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.903092 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"f5a1d044-9e9c-41ca-96e2-735248659691\" (UID: \"f5a1d044-9e9c-41ca-96e2-735248659691\") " Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.905958 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.906074 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.909880 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.919457 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1d044-9e9c-41ca-96e2-735248659691-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.919583 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f5a1d044-9e9c-41ca-96e2-735248659691-pod-info" (OuterVolumeSpecName: "pod-info") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.921040 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-kube-api-access-d4b86" (OuterVolumeSpecName: "kube-api-access-d4b86") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "kube-api-access-d4b86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.961548 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-server-conf" (OuterVolumeSpecName: "server-conf") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:05:30 crc kubenswrapper[4892]: I0217 19:05:30.982646 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f" (OuterVolumeSpecName: "persistence") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004673 4892 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004700 4892 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5a1d044-9e9c-41ca-96e2-735248659691-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004711 4892 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5a1d044-9e9c-41ca-96e2-735248659691-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004721 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4b86\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-kube-api-access-d4b86\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004731 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004765 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") on node \"crc\" " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004775 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 
19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.004785 4892 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5a1d044-9e9c-41ca-96e2-735248659691-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.027905 4892 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.028050 4892 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f") on node "crc" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.087542 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f5a1d044-9e9c-41ca-96e2-735248659691" (UID: "f5a1d044-9e9c-41ca-96e2-735248659691"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.105974 4892 reconciler_common.go:293] "Volume detached for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.106005 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5a1d044-9e9c-41ca-96e2-735248659691-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.183151 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.315505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlz92\" (UniqueName: \"kubernetes.io/projected/2a2fba10-1e82-4580-b350-548b00de2a1f-kube-api-access-zlz92\") pod \"2a2fba10-1e82-4580-b350-548b00de2a1f\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.315566 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-config\") pod \"2a2fba10-1e82-4580-b350-548b00de2a1f\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.315656 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-dns-svc\") pod \"2a2fba10-1e82-4580-b350-548b00de2a1f\" (UID: \"2a2fba10-1e82-4580-b350-548b00de2a1f\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.324075 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2fba10-1e82-4580-b350-548b00de2a1f-kube-api-access-zlz92" (OuterVolumeSpecName: "kube-api-access-zlz92") pod "2a2fba10-1e82-4580-b350-548b00de2a1f" (UID: "2a2fba10-1e82-4580-b350-548b00de2a1f"). InnerVolumeSpecName "kube-api-access-zlz92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.374280 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-config" (OuterVolumeSpecName: "config") pod "2a2fba10-1e82-4580-b350-548b00de2a1f" (UID: "2a2fba10-1e82-4580-b350-548b00de2a1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.385795 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a2fba10-1e82-4580-b350-548b00de2a1f" (UID: "2a2fba10-1e82-4580-b350-548b00de2a1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.419684 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlz92\" (UniqueName: \"kubernetes.io/projected/2a2fba10-1e82-4580-b350-548b00de2a1f-kube-api-access-zlz92\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.419721 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.419732 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2fba10-1e82-4580-b350-548b00de2a1f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.448368 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521077 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521149 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-plugins-conf\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521174 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvsg7\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-kube-api-access-dvsg7\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521348 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-server-conf\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521401 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-plugins\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521445 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4896fa46-2882-440a-b0fe-66ab208de548-pod-info\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521471 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-confd\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521517 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4896fa46-2882-440a-b0fe-66ab208de548-erlang-cookie-secret\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.521553 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-erlang-cookie\") pod \"4896fa46-2882-440a-b0fe-66ab208de548\" (UID: \"4896fa46-2882-440a-b0fe-66ab208de548\") " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.522471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.523282 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.523473 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.526467 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4896fa46-2882-440a-b0fe-66ab208de548-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.527920 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4896fa46-2882-440a-b0fe-66ab208de548-pod-info" (OuterVolumeSpecName: "pod-info") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.543317 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-kube-api-access-dvsg7" (OuterVolumeSpecName: "kube-api-access-dvsg7") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "kube-api-access-dvsg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.559650 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516" (OuterVolumeSpecName: "persistence") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "pvc-8306a818-37e0-43dd-8205-5089be91c516". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.568984 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-server-conf" (OuterVolumeSpecName: "server-conf") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.625440 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.625732 4892 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4896fa46-2882-440a-b0fe-66ab208de548-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.625793 4892 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4896fa46-2882-440a-b0fe-66ab208de548-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.625891 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.625972 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") on node \"crc\" " Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.626040 4892 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.626103 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvsg7\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-kube-api-access-dvsg7\") on node \"crc\" 
DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.626165 4892 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4896fa46-2882-440a-b0fe-66ab208de548-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.653164 4892 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.653399 4892 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8306a818-37e0-43dd-8205-5089be91c516" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516") on node "crc" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.660775 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4896fa46-2882-440a-b0fe-66ab208de548" (UID: "4896fa46-2882-440a-b0fe-66ab208de548"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.691468 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.691887 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5a1d044-9e9c-41ca-96e2-735248659691","Type":"ContainerDied","Data":"a93004432e89e5a6e7412c58a9f459b6d85dcae2adde344d9ee7d29a63d4b9c0"} Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.691942 4892 scope.go:117] "RemoveContainer" containerID="32a80db7adf34cf8be0fb7837d28f1a3c2f3f3ab2fdf42369624edf06010d26a" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.695043 4892 generic.go:334] "Generic (PLEG): container finished" podID="4896fa46-2882-440a-b0fe-66ab208de548" containerID="ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1" exitCode=0 Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.695077 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4896fa46-2882-440a-b0fe-66ab208de548","Type":"ContainerDied","Data":"ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1"} Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.695108 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4896fa46-2882-440a-b0fe-66ab208de548","Type":"ContainerDied","Data":"e68d5c8a3ea05e79fed3a98e48d5b1f36cf30c0809748315d1b9b4d5226b7616"} Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.695113 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.698055 4892 generic.go:334] "Generic (PLEG): container finished" podID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerID="deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2" exitCode=0 Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.698122 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.698124 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" event={"ID":"2a2fba10-1e82-4580-b350-548b00de2a1f","Type":"ContainerDied","Data":"deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2"} Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.698171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-2jqm4" event={"ID":"2a2fba10-1e82-4580-b350-548b00de2a1f","Type":"ContainerDied","Data":"f427d7f477bc65453eefe07739b6d11b31540fb3fc220f6f85ae5a46634ac5f6"} Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.716220 4892 scope.go:117] "RemoveContainer" containerID="0e1ac38ae4fa6b22abe369a93643c0a6ad6172744d4df53b1ba3eb6bf221a1cf" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.727899 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4896fa46-2882-440a-b0fe-66ab208de548-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.727930 4892 reconciler_common.go:293] "Volume detached for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") on node \"crc\" DevicePath \"\"" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.727952 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.735132 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.748915 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-2jqm4"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.767721 4892 scope.go:117] 
"RemoveContainer" containerID="ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.782663 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-2jqm4"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807391 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807768 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="extract-content" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807786 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="extract-content" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807794 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" containerName="rabbitmq" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807801 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" containerName="rabbitmq" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807833 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerName="init" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807840 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerName="init" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807853 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" containerName="setup-container" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807859 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" containerName="setup-container" Feb 17 19:05:31 crc 
kubenswrapper[4892]: E0217 19:05:31.807873 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="registry-server" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807878 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="registry-server" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807888 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4896fa46-2882-440a-b0fe-66ab208de548" containerName="setup-container" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807893 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4896fa46-2882-440a-b0fe-66ab208de548" containerName="setup-container" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807904 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4896fa46-2882-440a-b0fe-66ab208de548" containerName="rabbitmq" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807910 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4896fa46-2882-440a-b0fe-66ab208de548" containerName="rabbitmq" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807921 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="extract-utilities" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807927 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="extract-utilities" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.807939 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerName="dnsmasq-dns" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.807946 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerName="dnsmasq-dns" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 
19:05:31.808116 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" containerName="dnsmasq-dns" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.808129 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" containerName="rabbitmq" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.808138 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4896fa46-2882-440a-b0fe-66ab208de548" containerName="rabbitmq" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.808145 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="29beb521-cea5-4c7f-a444-d20860c48f12" containerName="registry-server" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.809031 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.816079 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.816299 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.816322 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.816324 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z997r" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.819913 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.821369 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.830149 4892 
scope.go:117] "RemoveContainer" containerID="cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.833006 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.839622 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.846804 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.848318 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.850223 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-db6cv" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.850235 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.850329 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.852107 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.852646 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.860426 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.861579 4892 scope.go:117] "RemoveContainer" containerID="ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1" Feb 17 19:05:31 crc 
kubenswrapper[4892]: E0217 19:05:31.861928 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1\": container with ID starting with ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1 not found: ID does not exist" containerID="ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.861956 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1"} err="failed to get container status \"ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1\": rpc error: code = NotFound desc = could not find container \"ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1\": container with ID starting with ee7c25ec1dd3875f374fec2625f4e9ca808ce7cd5a27f9a907cdf302af8967b1 not found: ID does not exist" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.861979 4892 scope.go:117] "RemoveContainer" containerID="cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.862236 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b\": container with ID starting with cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b not found: ID does not exist" containerID="cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.862282 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b"} err="failed to get container status 
\"cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b\": rpc error: code = NotFound desc = could not find container \"cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b\": container with ID starting with cfe22892d0efb55da7698e66f3e81a197baba371f76869ce1e3d5e6e9095f20b not found: ID does not exist" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.862311 4892 scope.go:117] "RemoveContainer" containerID="deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.884048 4892 scope.go:117] "RemoveContainer" containerID="80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.904095 4892 scope.go:117] "RemoveContainer" containerID="deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.904716 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2\": container with ID starting with deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2 not found: ID does not exist" containerID="deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.904789 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2"} err="failed to get container status \"deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2\": rpc error: code = NotFound desc = could not find container \"deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2\": container with ID starting with deb9f2ad7525b7d52a6ffb43c66aeb0b43701513203a2f19d24ec4f011ba5fe2 not found: ID does not exist" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.904930 4892 
scope.go:117] "RemoveContainer" containerID="80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde" Feb 17 19:05:31 crc kubenswrapper[4892]: E0217 19:05:31.905532 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde\": container with ID starting with 80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde not found: ID does not exist" containerID="80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.905585 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde"} err="failed to get container status \"80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde\": rpc error: code = NotFound desc = could not find container \"80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde\": container with ID starting with 80990b3226c6a2356df0b236deb39fdc0c5850aa9ef72469f57834673e5cbfde not found: ID does not exist" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932127 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce51a677-1590-46d7-a8d9-47d1d2564a70-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932178 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 
19:05:31.932206 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932229 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b45c3cf8-448f-4ac0-8964-d78733369884-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932242 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932260 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b45c3cf8-448f-4ac0-8964-d78733369884-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932287 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 
19:05:31.932310 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85w7f\" (UniqueName: \"kubernetes.io/projected/ce51a677-1590-46d7-a8d9-47d1d2564a70-kube-api-access-85w7f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932336 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932353 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b45c3cf8-448f-4ac0-8964-d78733369884-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932371 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932391 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce51a677-1590-46d7-a8d9-47d1d2564a70-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 
crc kubenswrapper[4892]: I0217 19:05:31.932426 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932447 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce51a677-1590-46d7-a8d9-47d1d2564a70-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932474 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b45c3cf8-448f-4ac0-8964-d78733369884-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932492 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbgh\" (UniqueName: \"kubernetes.io/projected/b45c3cf8-448f-4ac0-8964-d78733369884-kube-api-access-hsbgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 19:05:31.932506 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce51a677-1590-46d7-a8d9-47d1d2564a70-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:31 crc kubenswrapper[4892]: I0217 
19:05:31.932519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034619 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce51a677-1590-46d7-a8d9-47d1d2564a70-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034705 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034743 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034788 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b45c3cf8-448f-4ac0-8964-d78733369884-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034905 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034943 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b45c3cf8-448f-4ac0-8964-d78733369884-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.034987 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85w7f\" (UniqueName: \"kubernetes.io/projected/ce51a677-1590-46d7-a8d9-47d1d2564a70-kube-api-access-85w7f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035071 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035104 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/b45c3cf8-448f-4ac0-8964-d78733369884-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035144 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035180 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce51a677-1590-46d7-a8d9-47d1d2564a70-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035248 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035295 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce51a677-1590-46d7-a8d9-47d1d2564a70-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035331 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035346 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b45c3cf8-448f-4ac0-8964-d78733369884-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbgh\" (UniqueName: \"kubernetes.io/projected/b45c3cf8-448f-4ac0-8964-d78733369884-kube-api-access-hsbgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035413 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce51a677-1590-46d7-a8d9-47d1d2564a70-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035446 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.035788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b45c3cf8-448f-4ac0-8964-d78733369884-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc 
kubenswrapper[4892]: I0217 19:05:32.036060 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.036411 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.036470 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce51a677-1590-46d7-a8d9-47d1d2564a70-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.037151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.038032 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b45c3cf8-448f-4ac0-8964-d78733369884-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.038653 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ce51a677-1590-46d7-a8d9-47d1d2564a70-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.039576 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.039795 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0bcf9fa32c69b4158e289b66c9f947462be31d0b8fe8c84c345aa605e691f6ba/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.040394 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce51a677-1590-46d7-a8d9-47d1d2564a70-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.040488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b45c3cf8-448f-4ac0-8964-d78733369884-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.040633 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.040671 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01daa81c8d454feeafbbae15de02c09e5a451b10e7d393ca6cd9336766d33668/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.040805 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce51a677-1590-46d7-a8d9-47d1d2564a70-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.041150 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b45c3cf8-448f-4ac0-8964-d78733369884-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.045962 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b45c3cf8-448f-4ac0-8964-d78733369884-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.046246 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce51a677-1590-46d7-a8d9-47d1d2564a70-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.069490 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbgh\" (UniqueName: \"kubernetes.io/projected/b45c3cf8-448f-4ac0-8964-d78733369884-kube-api-access-hsbgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.074014 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85w7f\" (UniqueName: \"kubernetes.io/projected/ce51a677-1590-46d7-a8d9-47d1d2564a70-kube-api-access-85w7f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.083062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8306a818-37e0-43dd-8205-5089be91c516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8306a818-37e0-43dd-8205-5089be91c516\") pod \"rabbitmq-cell1-server-0\" (UID: \"b45c3cf8-448f-4ac0-8964-d78733369884\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.101036 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88515a40-e0e6-4b68-9f49-c6ac1986175f\") pod \"rabbitmq-server-0\" (UID: \"ce51a677-1590-46d7-a8d9-47d1d2564a70\") " pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.131480 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.165462 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.674800 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.731953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce51a677-1590-46d7-a8d9-47d1d2564a70","Type":"ContainerStarted","Data":"29fd00b232323cd77f0f261b4ccaec2ad2fc5e0e6eb0c037a10d1d4b09fca43d"} Feb 17 19:05:32 crc kubenswrapper[4892]: I0217 19:05:32.736135 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 19:05:32 crc kubenswrapper[4892]: W0217 19:05:32.755030 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45c3cf8_448f_4ac0_8964_d78733369884.slice/crio-4ffeacffb14cfa64d7a4ee311671794849d3478b83dce3524118c924bc7b7dc5 WatchSource:0}: Error finding container 4ffeacffb14cfa64d7a4ee311671794849d3478b83dce3524118c924bc7b7dc5: Status 404 returned error can't find the container with id 4ffeacffb14cfa64d7a4ee311671794849d3478b83dce3524118c924bc7b7dc5 Feb 17 19:05:33 crc kubenswrapper[4892]: I0217 19:05:33.385366 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a2fba10-1e82-4580-b350-548b00de2a1f" path="/var/lib/kubelet/pods/2a2fba10-1e82-4580-b350-548b00de2a1f/volumes" Feb 17 19:05:33 crc kubenswrapper[4892]: I0217 19:05:33.387856 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4896fa46-2882-440a-b0fe-66ab208de548" path="/var/lib/kubelet/pods/4896fa46-2882-440a-b0fe-66ab208de548/volumes" Feb 17 19:05:33 crc kubenswrapper[4892]: I0217 19:05:33.394020 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a1d044-9e9c-41ca-96e2-735248659691" path="/var/lib/kubelet/pods/f5a1d044-9e9c-41ca-96e2-735248659691/volumes" Feb 17 19:05:33 crc 
kubenswrapper[4892]: I0217 19:05:33.756955 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b45c3cf8-448f-4ac0-8964-d78733369884","Type":"ContainerStarted","Data":"4ffeacffb14cfa64d7a4ee311671794849d3478b83dce3524118c924bc7b7dc5"} Feb 17 19:05:34 crc kubenswrapper[4892]: I0217 19:05:34.771769 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b45c3cf8-448f-4ac0-8964-d78733369884","Type":"ContainerStarted","Data":"80bb441a8ef008a36926efc27afcb1c8c612f19757dd69317e39f15306afe704"} Feb 17 19:05:34 crc kubenswrapper[4892]: I0217 19:05:34.774193 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce51a677-1590-46d7-a8d9-47d1d2564a70","Type":"ContainerStarted","Data":"13257654a2e27047d65581818dba7ad42f930dc0145985e2097775c236437664"} Feb 17 19:06:07 crc kubenswrapper[4892]: I0217 19:06:07.080397 4892 generic.go:334] "Generic (PLEG): container finished" podID="b45c3cf8-448f-4ac0-8964-d78733369884" containerID="80bb441a8ef008a36926efc27afcb1c8c612f19757dd69317e39f15306afe704" exitCode=0 Feb 17 19:06:07 crc kubenswrapper[4892]: I0217 19:06:07.080475 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b45c3cf8-448f-4ac0-8964-d78733369884","Type":"ContainerDied","Data":"80bb441a8ef008a36926efc27afcb1c8c612f19757dd69317e39f15306afe704"} Feb 17 19:06:07 crc kubenswrapper[4892]: I0217 19:06:07.086616 4892 generic.go:334] "Generic (PLEG): container finished" podID="ce51a677-1590-46d7-a8d9-47d1d2564a70" containerID="13257654a2e27047d65581818dba7ad42f930dc0145985e2097775c236437664" exitCode=0 Feb 17 19:06:07 crc kubenswrapper[4892]: I0217 19:06:07.086803 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ce51a677-1590-46d7-a8d9-47d1d2564a70","Type":"ContainerDied","Data":"13257654a2e27047d65581818dba7ad42f930dc0145985e2097775c236437664"} Feb 17 19:06:08 crc kubenswrapper[4892]: I0217 19:06:08.095831 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b45c3cf8-448f-4ac0-8964-d78733369884","Type":"ContainerStarted","Data":"fba75be50b1c8b33488ce14a73bc7837102ae5f02b183c7e6967f06b2d853fa3"} Feb 17 19:06:08 crc kubenswrapper[4892]: I0217 19:06:08.096510 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:06:08 crc kubenswrapper[4892]: I0217 19:06:08.098414 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce51a677-1590-46d7-a8d9-47d1d2564a70","Type":"ContainerStarted","Data":"6146be191895248968c78a553ea7d99020edf915b6197f47e655f3be192adace"} Feb 17 19:06:08 crc kubenswrapper[4892]: I0217 19:06:08.098662 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 19:06:08 crc kubenswrapper[4892]: I0217 19:06:08.123069 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.123046624 podStartE2EDuration="37.123046624s" podCreationTimestamp="2026-02-17 19:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:06:08.118266714 +0000 UTC m=+4939.493669999" watchObservedRunningTime="2026-02-17 19:06:08.123046624 +0000 UTC m=+4939.498449889" Feb 17 19:06:22 crc kubenswrapper[4892]: I0217 19:06:22.137117 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 19:06:22 crc kubenswrapper[4892]: I0217 19:06:22.170855 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 17 19:06:22 crc kubenswrapper[4892]: I0217 19:06:22.175054 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.175027898 podStartE2EDuration="51.175027898s" podCreationTimestamp="2026-02-17 19:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:06:08.150490984 +0000 UTC m=+4939.525894279" watchObservedRunningTime="2026-02-17 19:06:22.175027898 +0000 UTC m=+4953.550431203" Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.341868 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.343838 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.359960 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.374687 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cl6br" Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.518148 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdxl\" (UniqueName: \"kubernetes.io/projected/bb08e0b0-bc0c-4e59-83ad-02753fab1950-kube-api-access-2pdxl\") pod \"mariadb-client\" (UID: \"bb08e0b0-bc0c-4e59-83ad-02753fab1950\") " pod="openstack/mariadb-client" Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.620307 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdxl\" (UniqueName: \"kubernetes.io/projected/bb08e0b0-bc0c-4e59-83ad-02753fab1950-kube-api-access-2pdxl\") pod \"mariadb-client\" (UID: \"bb08e0b0-bc0c-4e59-83ad-02753fab1950\") " 
pod="openstack/mariadb-client" Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.655301 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdxl\" (UniqueName: \"kubernetes.io/projected/bb08e0b0-bc0c-4e59-83ad-02753fab1950-kube-api-access-2pdxl\") pod \"mariadb-client\" (UID: \"bb08e0b0-bc0c-4e59-83ad-02753fab1950\") " pod="openstack/mariadb-client" Feb 17 19:06:34 crc kubenswrapper[4892]: I0217 19:06:34.695574 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:06:35 crc kubenswrapper[4892]: I0217 19:06:35.278557 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:06:35 crc kubenswrapper[4892]: W0217 19:06:35.279805 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb08e0b0_bc0c_4e59_83ad_02753fab1950.slice/crio-8755762062c54852c28f855a7d73a0ac00d285f13dec4830b26c24f50920ba9b WatchSource:0}: Error finding container 8755762062c54852c28f855a7d73a0ac00d285f13dec4830b26c24f50920ba9b: Status 404 returned error can't find the container with id 8755762062c54852c28f855a7d73a0ac00d285f13dec4830b26c24f50920ba9b Feb 17 19:06:35 crc kubenswrapper[4892]: I0217 19:06:35.398953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb08e0b0-bc0c-4e59-83ad-02753fab1950","Type":"ContainerStarted","Data":"8755762062c54852c28f855a7d73a0ac00d285f13dec4830b26c24f50920ba9b"} Feb 17 19:06:36 crc kubenswrapper[4892]: I0217 19:06:36.410594 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb08e0b0-bc0c-4e59-83ad-02753fab1950","Type":"ContainerStarted","Data":"1db196c3a97adcca8581d3f6cc05c621499a5408315eb274de3e3d5f617aa96d"} Feb 17 19:06:36 crc kubenswrapper[4892]: I0217 19:06:36.442778 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/mariadb-client" podStartSLOduration=2.442746068 podStartE2EDuration="2.442746068s" podCreationTimestamp="2026-02-17 19:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:06:36.427713683 +0000 UTC m=+4967.803117018" watchObservedRunningTime="2026-02-17 19:06:36.442746068 +0000 UTC m=+4967.818149363" Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.300958 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.301691 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="bb08e0b0-bc0c-4e59-83ad-02753fab1950" containerName="mariadb-client" containerID="cri-o://1db196c3a97adcca8581d3f6cc05c621499a5408315eb274de3e3d5f617aa96d" gracePeriod=30 Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.557133 4892 generic.go:334] "Generic (PLEG): container finished" podID="bb08e0b0-bc0c-4e59-83ad-02753fab1950" containerID="1db196c3a97adcca8581d3f6cc05c621499a5408315eb274de3e3d5f617aa96d" exitCode=143 Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.557381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb08e0b0-bc0c-4e59-83ad-02753fab1950","Type":"ContainerDied","Data":"1db196c3a97adcca8581d3f6cc05c621499a5408315eb274de3e3d5f617aa96d"} Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.827205 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.848071 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdxl\" (UniqueName: \"kubernetes.io/projected/bb08e0b0-bc0c-4e59-83ad-02753fab1950-kube-api-access-2pdxl\") pod \"bb08e0b0-bc0c-4e59-83ad-02753fab1950\" (UID: \"bb08e0b0-bc0c-4e59-83ad-02753fab1950\") " Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.856250 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb08e0b0-bc0c-4e59-83ad-02753fab1950-kube-api-access-2pdxl" (OuterVolumeSpecName: "kube-api-access-2pdxl") pod "bb08e0b0-bc0c-4e59-83ad-02753fab1950" (UID: "bb08e0b0-bc0c-4e59-83ad-02753fab1950"). InnerVolumeSpecName "kube-api-access-2pdxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:06:50 crc kubenswrapper[4892]: I0217 19:06:50.950493 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdxl\" (UniqueName: \"kubernetes.io/projected/bb08e0b0-bc0c-4e59-83ad-02753fab1950-kube-api-access-2pdxl\") on node \"crc\" DevicePath \"\"" Feb 17 19:06:51 crc kubenswrapper[4892]: I0217 19:06:51.569730 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb08e0b0-bc0c-4e59-83ad-02753fab1950","Type":"ContainerDied","Data":"8755762062c54852c28f855a7d73a0ac00d285f13dec4830b26c24f50920ba9b"} Feb 17 19:06:51 crc kubenswrapper[4892]: I0217 19:06:51.569780 4892 scope.go:117] "RemoveContainer" containerID="1db196c3a97adcca8581d3f6cc05c621499a5408315eb274de3e3d5f617aa96d" Feb 17 19:06:51 crc kubenswrapper[4892]: I0217 19:06:51.569792 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:06:51 crc kubenswrapper[4892]: I0217 19:06:51.594567 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:06:51 crc kubenswrapper[4892]: I0217 19:06:51.603526 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:06:53 crc kubenswrapper[4892]: I0217 19:06:53.374431 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb08e0b0-bc0c-4e59-83ad-02753fab1950" path="/var/lib/kubelet/pods/bb08e0b0-bc0c-4e59-83ad-02753fab1950/volumes" Feb 17 19:06:58 crc kubenswrapper[4892]: I0217 19:06:58.645286 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" podUID="e7381d2c-f7e2-4935-be5f-380479a2e516" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.53:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:06:58 crc kubenswrapper[4892]: I0217 19:06:58.645712 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-6888856db4-t2r56" podUID="e7381d2c-f7e2-4935-be5f-380479a2e516" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.53:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:07:07 crc kubenswrapper[4892]: I0217 19:07:07.425778 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:07:07 crc kubenswrapper[4892]: I0217 19:07:07.426423 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:07:37 crc kubenswrapper[4892]: I0217 19:07:37.425514 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:07:37 crc kubenswrapper[4892]: I0217 19:07:37.426310 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:08:01 crc kubenswrapper[4892]: I0217 19:08:01.162990 4892 scope.go:117] "RemoveContainer" containerID="ffe79af3b0f598c55b6de591794bd2c31e530cf69fafe08910716c5178225f7b" Feb 17 19:08:07 crc kubenswrapper[4892]: I0217 19:08:07.424711 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:08:07 crc kubenswrapper[4892]: I0217 19:08:07.425328 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:08:07 crc kubenswrapper[4892]: I0217 19:08:07.425372 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:08:07 crc kubenswrapper[4892]: I0217 19:08:07.426070 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:08:07 crc kubenswrapper[4892]: I0217 19:08:07.426129 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" gracePeriod=600 Feb 17 19:08:08 crc kubenswrapper[4892]: E0217 19:08:08.068282 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:08:08 crc kubenswrapper[4892]: I0217 19:08:08.456950 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" exitCode=0 Feb 17 19:08:08 crc kubenswrapper[4892]: I0217 19:08:08.456995 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39"} 
Feb 17 19:08:08 crc kubenswrapper[4892]: I0217 19:08:08.457030 4892 scope.go:117] "RemoveContainer" containerID="330e1bd88e0b3cdc2508642d4af65f7484d521050f49965de99307ecfc4b06ea" Feb 17 19:08:08 crc kubenswrapper[4892]: I0217 19:08:08.457706 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:08:08 crc kubenswrapper[4892]: E0217 19:08:08.458070 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:08:19 crc kubenswrapper[4892]: I0217 19:08:19.369147 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:08:19 crc kubenswrapper[4892]: E0217 19:08:19.370355 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:08:30 crc kubenswrapper[4892]: I0217 19:08:30.361602 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:08:30 crc kubenswrapper[4892]: E0217 19:08:30.362700 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:08:41 crc kubenswrapper[4892]: I0217 19:08:41.360640 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:08:41 crc kubenswrapper[4892]: E0217 19:08:41.361965 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:08:53 crc kubenswrapper[4892]: I0217 19:08:53.360757 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:08:53 crc kubenswrapper[4892]: E0217 19:08:53.361932 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:09:08 crc kubenswrapper[4892]: I0217 19:09:08.360902 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:09:08 crc kubenswrapper[4892]: E0217 19:09:08.362064 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:09:21 crc kubenswrapper[4892]: I0217 19:09:21.360944 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:09:21 crc kubenswrapper[4892]: E0217 19:09:21.362016 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:09:34 crc kubenswrapper[4892]: I0217 19:09:34.360360 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:09:34 crc kubenswrapper[4892]: E0217 19:09:34.361346 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:09:48 crc kubenswrapper[4892]: I0217 19:09:48.359291 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:09:48 crc kubenswrapper[4892]: E0217 19:09:48.361416 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:10:02 crc kubenswrapper[4892]: I0217 19:10:02.360596 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:10:02 crc kubenswrapper[4892]: E0217 19:10:02.361698 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:10:17 crc kubenswrapper[4892]: I0217 19:10:17.359989 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:10:17 crc kubenswrapper[4892]: E0217 19:10:17.361107 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:10:32 crc kubenswrapper[4892]: I0217 19:10:32.361197 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:10:32 crc kubenswrapper[4892]: E0217 19:10:32.362342 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:10:47 crc kubenswrapper[4892]: I0217 19:10:47.360579 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:10:47 crc kubenswrapper[4892]: E0217 19:10:47.361352 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:11:01 crc kubenswrapper[4892]: I0217 19:11:01.288482 4892 scope.go:117] "RemoveContainer" containerID="a7d7dabdbc951e58d287c7dbfcdd34c3144fd4ee67a2f404f74880801b24ae94" Feb 17 19:11:02 crc kubenswrapper[4892]: I0217 19:11:02.360331 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:11:02 crc kubenswrapper[4892]: E0217 19:11:02.360978 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:11:17 crc kubenswrapper[4892]: I0217 19:11:17.360364 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 
19:11:17 crc kubenswrapper[4892]: E0217 19:11:17.361608 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:11:30 crc kubenswrapper[4892]: I0217 19:11:30.361000 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:11:30 crc kubenswrapper[4892]: E0217 19:11:30.362250 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:11:42 crc kubenswrapper[4892]: I0217 19:11:42.359782 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:11:42 crc kubenswrapper[4892]: E0217 19:11:42.360958 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.622855 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 17 19:11:45 crc 
kubenswrapper[4892]: E0217 19:11:45.624068 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb08e0b0-bc0c-4e59-83ad-02753fab1950" containerName="mariadb-client" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.624087 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb08e0b0-bc0c-4e59-83ad-02753fab1950" containerName="mariadb-client" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.624479 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb08e0b0-bc0c-4e59-83ad-02753fab1950" containerName="mariadb-client" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.625585 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.627952 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cl6br" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.643261 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.767973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") " pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.768032 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m54t\" (UniqueName: \"kubernetes.io/projected/4e3f7981-bf3f-4acc-8345-e58e505778b4-kube-api-access-8m54t\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") " pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.870016 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8m54t\" (UniqueName: \"kubernetes.io/projected/4e3f7981-bf3f-4acc-8345-e58e505778b4-kube-api-access-8m54t\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") " pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.870074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") " pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.872765 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.872807 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ec12285626db956646286b7efa82846bd40fd493b7758e1ed5c55b3db5702f3/globalmount\"" pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.895370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m54t\" (UniqueName: \"kubernetes.io/projected/4e3f7981-bf3f-4acc-8345-e58e505778b4-kube-api-access-8m54t\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") " pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.917343 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a12d4439-d2e1-40ad-a569-1b9c7d02ed80\") pod \"mariadb-copy-data\" (UID: \"4e3f7981-bf3f-4acc-8345-e58e505778b4\") " pod="openstack/mariadb-copy-data" Feb 17 19:11:45 crc kubenswrapper[4892]: I0217 19:11:45.952618 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 17 19:11:46 crc kubenswrapper[4892]: I0217 19:11:46.504843 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 17 19:11:46 crc kubenswrapper[4892]: W0217 19:11:46.506056 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e3f7981_bf3f_4acc_8345_e58e505778b4.slice/crio-a74ee319ca15b7079094cb45160b8a2f0aee1de6749904be1e1e037662525b51 WatchSource:0}: Error finding container a74ee319ca15b7079094cb45160b8a2f0aee1de6749904be1e1e037662525b51: Status 404 returned error can't find the container with id a74ee319ca15b7079094cb45160b8a2f0aee1de6749904be1e1e037662525b51 Feb 17 19:11:46 crc kubenswrapper[4892]: I0217 19:11:46.743233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4e3f7981-bf3f-4acc-8345-e58e505778b4","Type":"ContainerStarted","Data":"ece216792de186d08d841c83a922c310ddca83924e8224b90080944393481fad"} Feb 17 19:11:46 crc kubenswrapper[4892]: I0217 19:11:46.743567 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4e3f7981-bf3f-4acc-8345-e58e505778b4","Type":"ContainerStarted","Data":"a74ee319ca15b7079094cb45160b8a2f0aee1de6749904be1e1e037662525b51"} Feb 17 19:11:46 crc kubenswrapper[4892]: I0217 19:11:46.767372 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.76734584 podStartE2EDuration="2.76734584s" 
podCreationTimestamp="2026-02-17 19:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:11:46.758876332 +0000 UTC m=+5278.134279627" watchObservedRunningTime="2026-02-17 19:11:46.76734584 +0000 UTC m=+5278.142749155" Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.601421 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.603628 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.612217 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.648123 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvvw\" (UniqueName: \"kubernetes.io/projected/fce94566-46e3-4629-a31a-f527da1ca6b1-kube-api-access-nwvvw\") pod \"mariadb-client\" (UID: \"fce94566-46e3-4629-a31a-f527da1ca6b1\") " pod="openstack/mariadb-client" Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.749902 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvvw\" (UniqueName: \"kubernetes.io/projected/fce94566-46e3-4629-a31a-f527da1ca6b1-kube-api-access-nwvvw\") pod \"mariadb-client\" (UID: \"fce94566-46e3-4629-a31a-f527da1ca6b1\") " pod="openstack/mariadb-client" Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.776000 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvvw\" (UniqueName: \"kubernetes.io/projected/fce94566-46e3-4629-a31a-f527da1ca6b1-kube-api-access-nwvvw\") pod \"mariadb-client\" (UID: \"fce94566-46e3-4629-a31a-f527da1ca6b1\") " pod="openstack/mariadb-client" Feb 17 19:11:49 crc kubenswrapper[4892]: I0217 19:11:49.948262 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:50 crc kubenswrapper[4892]: W0217 19:11:50.439111 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfce94566_46e3_4629_a31a_f527da1ca6b1.slice/crio-e68ef7132b631b0ab51c8e37914c823cf6c1c0d3de993b8f1be5e92925a7c7b8 WatchSource:0}: Error finding container e68ef7132b631b0ab51c8e37914c823cf6c1c0d3de993b8f1be5e92925a7c7b8: Status 404 returned error can't find the container with id e68ef7132b631b0ab51c8e37914c823cf6c1c0d3de993b8f1be5e92925a7c7b8 Feb 17 19:11:50 crc kubenswrapper[4892]: I0217 19:11:50.442827 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:50 crc kubenswrapper[4892]: I0217 19:11:50.784144 4892 generic.go:334] "Generic (PLEG): container finished" podID="fce94566-46e3-4629-a31a-f527da1ca6b1" containerID="f6a474d4d576b696edc8fcb48689448187c502500e101eb99a85ddc9a364f483" exitCode=0 Feb 17 19:11:50 crc kubenswrapper[4892]: I0217 19:11:50.784232 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fce94566-46e3-4629-a31a-f527da1ca6b1","Type":"ContainerDied","Data":"f6a474d4d576b696edc8fcb48689448187c502500e101eb99a85ddc9a364f483"} Feb 17 19:11:50 crc kubenswrapper[4892]: I0217 19:11:50.785148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fce94566-46e3-4629-a31a-f527da1ca6b1","Type":"ContainerStarted","Data":"e68ef7132b631b0ab51c8e37914c823cf6c1c0d3de993b8f1be5e92925a7c7b8"} Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.217637 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.249530 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_fce94566-46e3-4629-a31a-f527da1ca6b1/mariadb-client/0.log" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.275465 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.282637 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.393982 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:52 crc kubenswrapper[4892]: E0217 19:11:52.394899 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce94566-46e3-4629-a31a-f527da1ca6b1" containerName="mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.394947 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce94566-46e3-4629-a31a-f527da1ca6b1" containerName="mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.395413 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce94566-46e3-4629-a31a-f527da1ca6b1" containerName="mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.396903 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.402069 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.407656 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwvvw\" (UniqueName: \"kubernetes.io/projected/fce94566-46e3-4629-a31a-f527da1ca6b1-kube-api-access-nwvvw\") pod \"fce94566-46e3-4629-a31a-f527da1ca6b1\" (UID: \"fce94566-46e3-4629-a31a-f527da1ca6b1\") " Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.424291 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce94566-46e3-4629-a31a-f527da1ca6b1-kube-api-access-nwvvw" (OuterVolumeSpecName: "kube-api-access-nwvvw") pod "fce94566-46e3-4629-a31a-f527da1ca6b1" (UID: "fce94566-46e3-4629-a31a-f527da1ca6b1"). InnerVolumeSpecName "kube-api-access-nwvvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.510219 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6nk\" (UniqueName: \"kubernetes.io/projected/bd35ac34-19a8-4f42-853f-82d8bd9105a8-kube-api-access-5l6nk\") pod \"mariadb-client\" (UID: \"bd35ac34-19a8-4f42-853f-82d8bd9105a8\") " pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.510551 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwvvw\" (UniqueName: \"kubernetes.io/projected/fce94566-46e3-4629-a31a-f527da1ca6b1-kube-api-access-nwvvw\") on node \"crc\" DevicePath \"\"" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.612789 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6nk\" (UniqueName: \"kubernetes.io/projected/bd35ac34-19a8-4f42-853f-82d8bd9105a8-kube-api-access-5l6nk\") pod 
\"mariadb-client\" (UID: \"bd35ac34-19a8-4f42-853f-82d8bd9105a8\") " pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.633896 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6nk\" (UniqueName: \"kubernetes.io/projected/bd35ac34-19a8-4f42-853f-82d8bd9105a8-kube-api-access-5l6nk\") pod \"mariadb-client\" (UID: \"bd35ac34-19a8-4f42-853f-82d8bd9105a8\") " pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.779357 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.812068 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e68ef7132b631b0ab51c8e37914c823cf6c1c0d3de993b8f1be5e92925a7c7b8" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.812174 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:52 crc kubenswrapper[4892]: I0217 19:11:52.883894 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="fce94566-46e3-4629-a31a-f527da1ca6b1" podUID="bd35ac34-19a8-4f42-853f-82d8bd9105a8" Feb 17 19:11:53 crc kubenswrapper[4892]: I0217 19:11:53.144124 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:53 crc kubenswrapper[4892]: W0217 19:11:53.155263 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd35ac34_19a8_4f42_853f_82d8bd9105a8.slice/crio-7bddd8217fd28deef420a0167c93a039004babc6a6f71d2b426e9bade703a0c4 WatchSource:0}: Error finding container 7bddd8217fd28deef420a0167c93a039004babc6a6f71d2b426e9bade703a0c4: Status 404 returned error can't find the container with id 7bddd8217fd28deef420a0167c93a039004babc6a6f71d2b426e9bade703a0c4 
Feb 17 19:11:53 crc kubenswrapper[4892]: I0217 19:11:53.372245 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce94566-46e3-4629-a31a-f527da1ca6b1" path="/var/lib/kubelet/pods/fce94566-46e3-4629-a31a-f527da1ca6b1/volumes" Feb 17 19:11:53 crc kubenswrapper[4892]: I0217 19:11:53.822088 4892 generic.go:334] "Generic (PLEG): container finished" podID="bd35ac34-19a8-4f42-853f-82d8bd9105a8" containerID="b7dff2c42ca9e7d523d9b7fbca15bf5b1aa40db5b7b0c7348d30055ed989d67f" exitCode=0 Feb 17 19:11:53 crc kubenswrapper[4892]: I0217 19:11:53.822131 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bd35ac34-19a8-4f42-853f-82d8bd9105a8","Type":"ContainerDied","Data":"b7dff2c42ca9e7d523d9b7fbca15bf5b1aa40db5b7b0c7348d30055ed989d67f"} Feb 17 19:11:53 crc kubenswrapper[4892]: I0217 19:11:53.822156 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bd35ac34-19a8-4f42-853f-82d8bd9105a8","Type":"ContainerStarted","Data":"7bddd8217fd28deef420a0167c93a039004babc6a6f71d2b426e9bade703a0c4"} Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.217024 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.235664 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_bd35ac34-19a8-4f42-853f-82d8bd9105a8/mariadb-client/0.log" Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.265522 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.272431 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.373554 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6nk\" (UniqueName: \"kubernetes.io/projected/bd35ac34-19a8-4f42-853f-82d8bd9105a8-kube-api-access-5l6nk\") pod \"bd35ac34-19a8-4f42-853f-82d8bd9105a8\" (UID: \"bd35ac34-19a8-4f42-853f-82d8bd9105a8\") " Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.382607 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd35ac34-19a8-4f42-853f-82d8bd9105a8-kube-api-access-5l6nk" (OuterVolumeSpecName: "kube-api-access-5l6nk") pod "bd35ac34-19a8-4f42-853f-82d8bd9105a8" (UID: "bd35ac34-19a8-4f42-853f-82d8bd9105a8"). InnerVolumeSpecName "kube-api-access-5l6nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.477379 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6nk\" (UniqueName: \"kubernetes.io/projected/bd35ac34-19a8-4f42-853f-82d8bd9105a8-kube-api-access-5l6nk\") on node \"crc\" DevicePath \"\"" Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.850660 4892 scope.go:117] "RemoveContainer" containerID="b7dff2c42ca9e7d523d9b7fbca15bf5b1aa40db5b7b0c7348d30055ed989d67f" Feb 17 19:11:55 crc kubenswrapper[4892]: I0217 19:11:55.850681 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 17 19:11:57 crc kubenswrapper[4892]: I0217 19:11:57.360247 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:11:57 crc kubenswrapper[4892]: E0217 19:11:57.361027 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:11:57 crc kubenswrapper[4892]: I0217 19:11:57.382154 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd35ac34-19a8-4f42-853f-82d8bd9105a8" path="/var/lib/kubelet/pods/bd35ac34-19a8-4f42-853f-82d8bd9105a8/volumes" Feb 17 19:12:08 crc kubenswrapper[4892]: I0217 19:12:08.360275 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:12:08 crc kubenswrapper[4892]: E0217 19:12:08.361249 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:12:23 crc kubenswrapper[4892]: I0217 19:12:23.360811 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:12:23 crc kubenswrapper[4892]: E0217 19:12:23.362014 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.244780 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 19:12:25 crc kubenswrapper[4892]: E0217 19:12:25.245770 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd35ac34-19a8-4f42-853f-82d8bd9105a8" containerName="mariadb-client" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.245790 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd35ac34-19a8-4f42-853f-82d8bd9105a8" containerName="mariadb-client" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.246068 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd35ac34-19a8-4f42-853f-82d8bd9105a8" containerName="mariadb-client" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.249299 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.252436 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5c75w" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.252574 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.254985 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.259980 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.280770 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.282575 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.299638 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.303887 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.333464 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.348428 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.455712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnlpn\" (UniqueName: \"kubernetes.io/projected/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-kube-api-access-pnlpn\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.455842 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d7585d4-873b-4de6-97f9-4e9f719a0805-config\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.455893 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.455937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456031 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86v8n\" (UniqueName: \"kubernetes.io/projected/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-kube-api-access-86v8n\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456093 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-61289e42-0f22-4a19-8cbf-fb181019a676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61289e42-0f22-4a19-8cbf-fb181019a676\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456119 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d7585d4-873b-4de6-97f9-4e9f719a0805-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456168 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456194 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-config\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456222 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456321 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-config\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456368 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456835 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456933 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0d7585d4-873b-4de6-97f9-4e9f719a0805-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.456996 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7585d4-873b-4de6-97f9-4e9f719a0805-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.457022 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/0d7585d4-873b-4de6-97f9-4e9f719a0805-kube-api-access-mlkr4\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.457098 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.475642 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.477610 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.481881 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.481936 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-52mh7" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.482018 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.484658 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.486248 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.499484 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.502702 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.508264 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.523655 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.540157 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.558565 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d7585d4-873b-4de6-97f9-4e9f719a0805-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.558760 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7585d4-873b-4de6-97f9-4e9f719a0805-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.558877 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/0d7585d4-873b-4de6-97f9-4e9f719a0805-kube-api-access-mlkr4\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.558978 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " 
pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559078 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnlpn\" (UniqueName: \"kubernetes.io/projected/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-kube-api-access-pnlpn\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559169 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d7585d4-873b-4de6-97f9-4e9f719a0805-config\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559272 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559378 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559490 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86v8n\" (UniqueName: \"kubernetes.io/projected/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-kube-api-access-86v8n\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559597 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-61289e42-0f22-4a19-8cbf-fb181019a676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61289e42-0f22-4a19-8cbf-fb181019a676\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559695 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d7585d4-873b-4de6-97f9-4e9f719a0805-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559811 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559940 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-config\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.560041 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.560143 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-ovsdb-rundir\") pod 
\"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.560248 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-config\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.560367 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.560502 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.560940 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d7585d4-873b-4de6-97f9-4e9f719a0805-config\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.559624 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d7585d4-873b-4de6-97f9-4e9f719a0805-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.561934 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d7585d4-873b-4de6-97f9-4e9f719a0805-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.562555 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.562603 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.563676 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.563791 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564285 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564313 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564322 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-61289e42-0f22-4a19-8cbf-fb181019a676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61289e42-0f22-4a19-8cbf-fb181019a676\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d8ba166d4328d706c5feebc998786f4761bb79616baa8ecd6de75df6d9a0f5f6/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564334 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-config\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564508 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9f66830ea84a4050bd759874c8b655f608d1d6df666ae9ae355ed22eb827b49/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564526 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564548 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7b2faa6d5dbf0f1ac052c17d54f0522b38d4f5820f51891fe40c9634c0049b29/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.564977 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-config\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.570447 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.570448 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7585d4-873b-4de6-97f9-4e9f719a0805-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.570458 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " 
pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.579300 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/0d7585d4-873b-4de6-97f9-4e9f719a0805-kube-api-access-mlkr4\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.579525 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86v8n\" (UniqueName: \"kubernetes.io/projected/5457fa69-1de2-46fe-bd62-0af0c4bc3bd7-kube-api-access-86v8n\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.587019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnlpn\" (UniqueName: \"kubernetes.io/projected/05c8f198-b5d3-4126-a3ed-89c2fd6cca1f-kube-api-access-pnlpn\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.613895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e1eb174-22ed-4d3d-8c6b-c526c90a14e4\") pod \"ovsdbserver-sb-0\" (UID: \"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f\") " pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.613895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-61289e42-0f22-4a19-8cbf-fb181019a676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61289e42-0f22-4a19-8cbf-fb181019a676\") pod \"ovsdbserver-sb-2\" (UID: \"0d7585d4-873b-4de6-97f9-4e9f719a0805\") " pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 
19:12:25.616494 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-053bcdea-3ba8-4bae-b299-5cbde32ede63\") pod \"ovsdbserver-sb-1\" (UID: \"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7\") " pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.619732 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.662404 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be323205-1316-4c35-9475-8d0e5a64c889-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.662739 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d988b11e-1dda-417d-8792-4130cb690552-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.662771 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be323205-1316-4c35-9475-8d0e5a64c889-config\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.662798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hlg\" (UniqueName: \"kubernetes.io/projected/d988b11e-1dda-417d-8792-4130cb690552-kube-api-access-d9hlg\") pod \"ovsdbserver-nb-2\" (UID: 
\"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.662849 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbr9t\" (UniqueName: \"kubernetes.io/projected/8f5f5244-28cd-4312-84a9-7baa4b22d017-kube-api-access-qbr9t\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.662989 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5f5244-28cd-4312-84a9-7baa4b22d017-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663129 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be323205-1316-4c35-9475-8d0e5a64c889-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be68897e-831d-41aa-9531-692b43ed9a77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be68897e-831d-41aa-9531-692b43ed9a77\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663242 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f5f5244-28cd-4312-84a9-7baa4b22d017-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663280 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be323205-1316-4c35-9475-8d0e5a64c889-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663301 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dfsb\" (UniqueName: \"kubernetes.io/projected/be323205-1316-4c35-9475-8d0e5a64c889-kube-api-access-4dfsb\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663399 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d988b11e-1dda-417d-8792-4130cb690552-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663425 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d988b11e-1dda-417d-8792-4130cb690552-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: 
\"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663446 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f5f5244-28cd-4312-84a9-7baa4b22d017-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663472 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d988b11e-1dda-417d-8792-4130cb690552-config\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663502 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5f5244-28cd-4312-84a9-7baa4b22d017-config\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.663529 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-134401d9-7f33-4f52-989b-3ce502edee10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134401d9-7f33-4f52-989b-3ce502edee10\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765051 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " 
pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765104 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d988b11e-1dda-417d-8792-4130cb690552-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765133 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d988b11e-1dda-417d-8792-4130cb690552-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765158 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f5f5244-28cd-4312-84a9-7baa4b22d017-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765210 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d988b11e-1dda-417d-8792-4130cb690552-config\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765242 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5f5244-28cd-4312-84a9-7baa4b22d017-config\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-134401d9-7f33-4f52-989b-3ce502edee10\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134401d9-7f33-4f52-989b-3ce502edee10\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765319 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be323205-1316-4c35-9475-8d0e5a64c889-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765355 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d988b11e-1dda-417d-8792-4130cb690552-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765378 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be323205-1316-4c35-9475-8d0e5a64c889-config\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hlg\" (UniqueName: \"kubernetes.io/projected/d988b11e-1dda-417d-8792-4130cb690552-kube-api-access-d9hlg\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765427 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbr9t\" (UniqueName: \"kubernetes.io/projected/8f5f5244-28cd-4312-84a9-7baa4b22d017-kube-api-access-qbr9t\") pod \"ovsdbserver-nb-0\" (UID: 
\"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765460 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5f5244-28cd-4312-84a9-7baa4b22d017-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765508 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be323205-1316-4c35-9475-8d0e5a64c889-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765558 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be68897e-831d-41aa-9531-692b43ed9a77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be68897e-831d-41aa-9531-692b43ed9a77\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765576 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f5f5244-28cd-4312-84a9-7baa4b22d017-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765602 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be323205-1316-4c35-9475-8d0e5a64c889-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.765626 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dfsb\" (UniqueName: \"kubernetes.io/projected/be323205-1316-4c35-9475-8d0e5a64c889-kube-api-access-4dfsb\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.766362 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d988b11e-1dda-417d-8792-4130cb690552-config\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.766675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d988b11e-1dda-417d-8792-4130cb690552-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.768102 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5f5244-28cd-4312-84a9-7baa4b22d017-config\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.768147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f5f5244-28cd-4312-84a9-7baa4b22d017-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.768150 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be323205-1316-4c35-9475-8d0e5a64c889-scripts\") pod \"ovsdbserver-nb-1\" (UID: 
\"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.768169 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d988b11e-1dda-417d-8792-4130cb690552-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.768478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be323205-1316-4c35-9475-8d0e5a64c889-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.769784 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.769831 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-134401d9-7f33-4f52-989b-3ce502edee10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134401d9-7f33-4f52-989b-3ce502edee10\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff18bc6e5782983ba509ac889417ab280a03af0d140edaf0b8a5c39fc63b5f6b/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.772795 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.772857 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be68897e-831d-41aa-9531-692b43ed9a77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be68897e-831d-41aa-9531-692b43ed9a77\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6ca4fa4674b2a6b7512b8fe428c84bcbf2be2be6789c3b5271060e89f5531751/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.773307 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f5f5244-28cd-4312-84a9-7baa4b22d017-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.774639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d988b11e-1dda-417d-8792-4130cb690552-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.778361 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be323205-1316-4c35-9475-8d0e5a64c889-config\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.790317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5f5244-28cd-4312-84a9-7baa4b22d017-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" 
Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.790585 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.790624 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f06abef84bd827538b0a39137613e7c50ed27b705f7fcfa0cb6efd0c9942edd7/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.791109 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be323205-1316-4c35-9475-8d0e5a64c889-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.793297 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbr9t\" (UniqueName: \"kubernetes.io/projected/8f5f5244-28cd-4312-84a9-7baa4b22d017-kube-api-access-qbr9t\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.793558 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hlg\" (UniqueName: \"kubernetes.io/projected/d988b11e-1dda-417d-8792-4130cb690552-kube-api-access-d9hlg\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.795784 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4dfsb\" (UniqueName: \"kubernetes.io/projected/be323205-1316-4c35-9475-8d0e5a64c889-kube-api-access-4dfsb\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.824857 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-134401d9-7f33-4f52-989b-3ce502edee10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134401d9-7f33-4f52-989b-3ce502edee10\") pod \"ovsdbserver-nb-1\" (UID: \"be323205-1316-4c35-9475-8d0e5a64c889\") " pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.826871 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3809ae8e-d80c-4656-a492-0d3ba71de992\") pod \"ovsdbserver-nb-2\" (UID: \"d988b11e-1dda-417d-8792-4130cb690552\") " pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.829780 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.845757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be68897e-831d-41aa-9531-692b43ed9a77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be68897e-831d-41aa-9531-692b43ed9a77\") pod \"ovsdbserver-nb-0\" (UID: \"8f5f5244-28cd-4312-84a9-7baa4b22d017\") " pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.872905 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:25 crc kubenswrapper[4892]: I0217 19:12:25.899530 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.103412 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.119624 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.141461 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.168782 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0d7585d4-873b-4de6-97f9-4e9f719a0805","Type":"ContainerStarted","Data":"493e6cec10079b7f6f7c7445981377b8f6248c6c4c7ca33450bee28fd066f57c"} Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.400248 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.502761 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.589025 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 17 19:12:26 crc kubenswrapper[4892]: W0217 19:12:26.673323 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe323205_1316_4c35_9475_8d0e5a64c889.slice/crio-9e63ca80600b33f274daa3c118ed86de194ff1977a3c428a77fa35d10c1ced28 WatchSource:0}: Error finding container 9e63ca80600b33f274daa3c118ed86de194ff1977a3c428a77fa35d10c1ced28: Status 404 returned error can't find the container with id 9e63ca80600b33f274daa3c118ed86de194ff1977a3c428a77fa35d10c1ced28 Feb 17 19:12:26 crc kubenswrapper[4892]: W0217 19:12:26.696169 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5457fa69_1de2_46fe_bd62_0af0c4bc3bd7.slice/crio-e0735466ef947041a02ebbfdb1ae37dfe4df129a445ae3207cd4b8dd6efb1a68 WatchSource:0}: Error finding container e0735466ef947041a02ebbfdb1ae37dfe4df129a445ae3207cd4b8dd6efb1a68: Status 404 returned error can't find the container with id e0735466ef947041a02ebbfdb1ae37dfe4df129a445ae3207cd4b8dd6efb1a68 Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.725039 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 17 19:12:26 crc kubenswrapper[4892]: W0217 19:12:26.745629 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd988b11e_1dda_417d_8792_4130cb690552.slice/crio-92d64436d22729aff677f920099f745ce5ebce7c5d7a78c437ecea263f9f2412 WatchSource:0}: Error finding container 92d64436d22729aff677f920099f745ce5ebce7c5d7a78c437ecea263f9f2412: Status 404 returned error can't find the container with id 92d64436d22729aff677f920099f745ce5ebce7c5d7a78c437ecea263f9f2412 Feb 17 19:12:26 crc kubenswrapper[4892]: I0217 19:12:26.806604 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 19:12:26 crc kubenswrapper[4892]: W0217 19:12:26.963752 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f5f5244_28cd_4312_84a9_7baa4b22d017.slice/crio-8ad303722a242c938ba1b93fe79a0362c8f375ba2cdd00a8bd8e75d0f6864769 WatchSource:0}: Error finding container 8ad303722a242c938ba1b93fe79a0362c8f375ba2cdd00a8bd8e75d0f6864769: Status 404 returned error can't find the container with id 8ad303722a242c938ba1b93fe79a0362c8f375ba2cdd00a8bd8e75d0f6864769 Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.179599 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"8f5f5244-28cd-4312-84a9-7baa4b22d017","Type":"ContainerStarted","Data":"8ad303722a242c938ba1b93fe79a0362c8f375ba2cdd00a8bd8e75d0f6864769"} Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.182679 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0d7585d4-873b-4de6-97f9-4e9f719a0805","Type":"ContainerStarted","Data":"668e2abdb0513599cc7a80ee6be52192eaece24e0d5657a5f510a663c3d6183f"} Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.184963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"be323205-1316-4c35-9475-8d0e5a64c889","Type":"ContainerStarted","Data":"6091d645e869541128393a0a94ac2513da80a8f77733136add789a1acdea1781"} Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.185026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"be323205-1316-4c35-9475-8d0e5a64c889","Type":"ContainerStarted","Data":"9e63ca80600b33f274daa3c118ed86de194ff1977a3c428a77fa35d10c1ced28"} Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.186478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d988b11e-1dda-417d-8792-4130cb690552","Type":"ContainerStarted","Data":"92d64436d22729aff677f920099f745ce5ebce7c5d7a78c437ecea263f9f2412"} Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.187854 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f","Type":"ContainerStarted","Data":"83d8afafdfaa952228c88105d293c981b6b7ca0dc6999e899eb907349873767e"} Feb 17 19:12:27 crc kubenswrapper[4892]: I0217 19:12:27.188936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7","Type":"ContainerStarted","Data":"e0735466ef947041a02ebbfdb1ae37dfe4df129a445ae3207cd4b8dd6efb1a68"} Feb 17 19:12:28 crc 
kubenswrapper[4892]: I0217 19:12:28.202875 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d988b11e-1dda-417d-8792-4130cb690552","Type":"ContainerStarted","Data":"838bc35d52c6b7ba0c47b76c26116abc275ec2bc4670fe44bc7c04020803cae2"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.203234 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d988b11e-1dda-417d-8792-4130cb690552","Type":"ContainerStarted","Data":"230fd72bd87b528dbf0dfd3c522a639b019fab194cb74cdb2c2a35407bc4d6f8"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.205534 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f","Type":"ContainerStarted","Data":"75b60011b7c257c7c7975fc676853b3142658f20d1b1d9d32bb06bc186760588"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.205577 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05c8f198-b5d3-4126-a3ed-89c2fd6cca1f","Type":"ContainerStarted","Data":"69306998d933232d22766af691dd71dbdc771f7c645c637c40a33574901a8484"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.208202 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7","Type":"ContainerStarted","Data":"4a3bf566b847f847fa341367a25a9be34ede3977640e8f267e42d7855c7d1ba7"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.208250 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5457fa69-1de2-46fe-bd62-0af0c4bc3bd7","Type":"ContainerStarted","Data":"159b2238d0fb28331672479294f6976ecc35f94ff0d1961391eb69f29a6979d7"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.210594 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"8f5f5244-28cd-4312-84a9-7baa4b22d017","Type":"ContainerStarted","Data":"9e3f60ba862e2d3b7a3a955f9a34e10e643db7b3bcdc8d2c580cc7848f52b99b"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.210630 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8f5f5244-28cd-4312-84a9-7baa4b22d017","Type":"ContainerStarted","Data":"84df3dd07699a5a46f00b999da311a7d1b0a1fd2c539ea85b6c09630b98b35fa"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.213992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0d7585d4-873b-4de6-97f9-4e9f719a0805","Type":"ContainerStarted","Data":"2324b2f5ce2e5c6d17f53326d0616bd9cc897549906f9c65b0d77c7e0f6d4503"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.216982 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"be323205-1316-4c35-9475-8d0e5a64c889","Type":"ContainerStarted","Data":"6ff40c50b76e4fb3ecca7eb7e4ef949d71318fa48f48d19ce15141c8f60a4c05"} Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.239085 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.239054838 podStartE2EDuration="4.239054838s" podCreationTimestamp="2026-02-17 19:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:28.225115772 +0000 UTC m=+5319.600519077" watchObservedRunningTime="2026-02-17 19:12:28.239054838 +0000 UTC m=+5319.614458133" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.266697 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.266669743 podStartE2EDuration="4.266669743s" podCreationTimestamp="2026-02-17 19:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:28.250987161 +0000 UTC m=+5319.626390466" watchObservedRunningTime="2026-02-17 19:12:28.266669743 +0000 UTC m=+5319.642073028" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.286368 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.286342424 podStartE2EDuration="4.286342424s" podCreationTimestamp="2026-02-17 19:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:28.2750938 +0000 UTC m=+5319.650497135" watchObservedRunningTime="2026-02-17 19:12:28.286342424 +0000 UTC m=+5319.661745709" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.296427 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.296403195 podStartE2EDuration="4.296403195s" podCreationTimestamp="2026-02-17 19:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:28.294304459 +0000 UTC m=+5319.669707754" watchObservedRunningTime="2026-02-17 19:12:28.296403195 +0000 UTC m=+5319.671806480" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.321112 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.321085221 podStartE2EDuration="4.321085221s" podCreationTimestamp="2026-02-17 19:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:28.316638311 +0000 UTC m=+5319.692041586" watchObservedRunningTime="2026-02-17 19:12:28.321085221 +0000 UTC m=+5319.696488506" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.347238 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.347217426 podStartE2EDuration="4.347217426s" podCreationTimestamp="2026-02-17 19:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:28.338180903 +0000 UTC m=+5319.713584178" watchObservedRunningTime="2026-02-17 19:12:28.347217426 +0000 UTC m=+5319.722620701" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.620472 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.832929 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.873099 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:28 crc kubenswrapper[4892]: I0217 19:12:28.899985 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:29 crc kubenswrapper[4892]: I0217 19:12:29.103930 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:29 crc kubenswrapper[4892]: I0217 19:12:29.120154 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:30 crc kubenswrapper[4892]: I0217 19:12:30.620995 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:30 crc kubenswrapper[4892]: I0217 19:12:30.830933 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:30 crc kubenswrapper[4892]: I0217 19:12:30.873534 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:30 crc kubenswrapper[4892]: I0217 
19:12:30.900011 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:31 crc kubenswrapper[4892]: I0217 19:12:31.103896 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:31 crc kubenswrapper[4892]: I0217 19:12:31.120196 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:31 crc kubenswrapper[4892]: I0217 19:12:31.676989 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:31 crc kubenswrapper[4892]: I0217 19:12:31.892504 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:31 crc kubenswrapper[4892]: I0217 19:12:31.945737 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:31 crc kubenswrapper[4892]: I0217 19:12:31.972072 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.145018 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.165120 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.306425 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.316160 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.325952 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 17 19:12:32 crc 
kubenswrapper[4892]: I0217 19:12:32.327591 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.336756 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.340237 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.545377 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57f4f8c4cc-sdppk"] Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.547094 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.549362 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.555466 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f4f8c4cc-sdppk"] Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.627845 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-config\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.628175 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-ovsdbserver-nb\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc 
kubenswrapper[4892]: I0217 19:12:32.628214 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-dns-svc\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.628247 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbfr\" (UniqueName: \"kubernetes.io/projected/54b98418-13c7-4b90-8392-9dd43c469ec0-kube-api-access-ddbfr\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.672476 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f4f8c4cc-sdppk"] Feb 17 19:12:32 crc kubenswrapper[4892]: E0217 19:12:32.673143 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-ddbfr ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" podUID="54b98418-13c7-4b90-8392-9dd43c469ec0" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.701771 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b798876ff-xz2gr"] Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.703447 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.705291 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.720840 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b798876ff-xz2gr"] Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.733499 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-config\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.733616 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-ovsdbserver-nb\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.733656 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-dns-svc\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.733690 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbfr\" (UniqueName: \"kubernetes.io/projected/54b98418-13c7-4b90-8392-9dd43c469ec0-kube-api-access-ddbfr\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.742649 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-config\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.743411 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-ovsdbserver-nb\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.743984 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-dns-svc\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.777388 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbfr\" (UniqueName: \"kubernetes.io/projected/54b98418-13c7-4b90-8392-9dd43c469ec0-kube-api-access-ddbfr\") pod \"dnsmasq-dns-57f4f8c4cc-sdppk\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.835752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-nb\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.835863 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-config\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.835957 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-dns-svc\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.836032 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jpth\" (UniqueName: \"kubernetes.io/projected/462a6f91-70eb-412e-9c96-104545e1a695-kube-api-access-4jpth\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.836052 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-sb\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.938168 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-dns-svc\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.938301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4jpth\" (UniqueName: \"kubernetes.io/projected/462a6f91-70eb-412e-9c96-104545e1a695-kube-api-access-4jpth\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.938326 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-sb\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.938376 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-nb\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.938430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-config\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.939472 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-config\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.940234 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-dns-svc\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.940788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-nb\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.940873 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-sb\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:32 crc kubenswrapper[4892]: I0217 19:12:32.957453 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jpth\" (UniqueName: \"kubernetes.io/projected/462a6f91-70eb-412e-9c96-104545e1a695-kube-api-access-4jpth\") pod \"dnsmasq-dns-7b798876ff-xz2gr\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.023117 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.275306 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.287339 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448065 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-dns-svc\") pod \"54b98418-13c7-4b90-8392-9dd43c469ec0\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448114 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-ovsdbserver-nb\") pod \"54b98418-13c7-4b90-8392-9dd43c469ec0\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448269 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddbfr\" (UniqueName: \"kubernetes.io/projected/54b98418-13c7-4b90-8392-9dd43c469ec0-kube-api-access-ddbfr\") pod \"54b98418-13c7-4b90-8392-9dd43c469ec0\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448355 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-config\") pod \"54b98418-13c7-4b90-8392-9dd43c469ec0\" (UID: \"54b98418-13c7-4b90-8392-9dd43c469ec0\") " Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448604 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54b98418-13c7-4b90-8392-9dd43c469ec0" (UID: "54b98418-13c7-4b90-8392-9dd43c469ec0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54b98418-13c7-4b90-8392-9dd43c469ec0" (UID: "54b98418-13c7-4b90-8392-9dd43c469ec0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.448808 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-config" (OuterVolumeSpecName: "config") pod "54b98418-13c7-4b90-8392-9dd43c469ec0" (UID: "54b98418-13c7-4b90-8392-9dd43c469ec0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.450829 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.450857 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.450869 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54b98418-13c7-4b90-8392-9dd43c469ec0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.453367 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b98418-13c7-4b90-8392-9dd43c469ec0-kube-api-access-ddbfr" (OuterVolumeSpecName: "kube-api-access-ddbfr") pod "54b98418-13c7-4b90-8392-9dd43c469ec0" (UID: 
"54b98418-13c7-4b90-8392-9dd43c469ec0"). InnerVolumeSpecName "kube-api-access-ddbfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.528060 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b798876ff-xz2gr"] Feb 17 19:12:33 crc kubenswrapper[4892]: I0217 19:12:33.552357 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddbfr\" (UniqueName: \"kubernetes.io/projected/54b98418-13c7-4b90-8392-9dd43c469ec0-kube-api-access-ddbfr\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:34 crc kubenswrapper[4892]: I0217 19:12:34.291353 4892 generic.go:334] "Generic (PLEG): container finished" podID="462a6f91-70eb-412e-9c96-104545e1a695" containerID="fe576cc0825d047473bda3255f916353177d4c9d6a3c5f9ce47b564f1915d83a" exitCode=0 Feb 17 19:12:34 crc kubenswrapper[4892]: I0217 19:12:34.291446 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57f4f8c4cc-sdppk" Feb 17 19:12:34 crc kubenswrapper[4892]: I0217 19:12:34.292075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" event={"ID":"462a6f91-70eb-412e-9c96-104545e1a695","Type":"ContainerDied","Data":"fe576cc0825d047473bda3255f916353177d4c9d6a3c5f9ce47b564f1915d83a"} Feb 17 19:12:34 crc kubenswrapper[4892]: I0217 19:12:34.292203 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" event={"ID":"462a6f91-70eb-412e-9c96-104545e1a695","Type":"ContainerStarted","Data":"c34c4d43be081e54d86a9876d8f5b12e5315965588ef565d5bc64ba6077612a6"} Feb 17 19:12:34 crc kubenswrapper[4892]: I0217 19:12:34.468765 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f4f8c4cc-sdppk"] Feb 17 19:12:34 crc kubenswrapper[4892]: I0217 19:12:34.480725 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57f4f8c4cc-sdppk"] Feb 17 19:12:35 
crc kubenswrapper[4892]: I0217 19:12:35.306279 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" event={"ID":"462a6f91-70eb-412e-9c96-104545e1a695","Type":"ContainerStarted","Data":"59f78a5ea41e8cce831a0d1a4fbe477fd8085e9f9d45e089134c804f21d3e03e"} Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.306702 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.360483 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:12:35 crc kubenswrapper[4892]: E0217 19:12:35.360987 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.376713 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b98418-13c7-4b90-8392-9dd43c469ec0" path="/var/lib/kubelet/pods/54b98418-13c7-4b90-8392-9dd43c469ec0/volumes" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.423442 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" podStartSLOduration=3.423416257 podStartE2EDuration="3.423416257s" podCreationTimestamp="2026-02-17 19:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:35.340387817 +0000 UTC m=+5326.715791122" watchObservedRunningTime="2026-02-17 19:12:35.423416257 +0000 UTC m=+5326.798819552" Feb 17 19:12:35 crc 
kubenswrapper[4892]: I0217 19:12:35.431322 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.432905 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.435344 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.442657 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.492154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm75v\" (UniqueName: \"kubernetes.io/projected/d6e36b08-e650-4dec-82ee-72c4e5e013d7-kube-api-access-vm75v\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.492288 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.493414 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d6e36b08-e650-4dec-82ee-72c4e5e013d7-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.595550 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: 
\"kubernetes.io/secret/d6e36b08-e650-4dec-82ee-72c4e5e013d7-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.595742 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm75v\" (UniqueName: \"kubernetes.io/projected/d6e36b08-e650-4dec-82ee-72c4e5e013d7-kube-api-access-vm75v\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.595902 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.601052 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.601097 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3662fedc9f2431363a31f9d7faf2a3269097141af565b5ee3be74ede943f9ab6/globalmount\"" pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.603081 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d6e36b08-e650-4dec-82ee-72c4e5e013d7-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.614739 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm75v\" (UniqueName: \"kubernetes.io/projected/d6e36b08-e650-4dec-82ee-72c4e5e013d7-kube-api-access-vm75v\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.639056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b1483c-ac29-461e-b318-b9499ea694f0\") pod \"ovn-copy-data\" (UID: \"d6e36b08-e650-4dec-82ee-72c4e5e013d7\") " pod="openstack/ovn-copy-data" Feb 17 19:12:35 crc kubenswrapper[4892]: I0217 19:12:35.750570 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 17 19:12:36 crc kubenswrapper[4892]: I0217 19:12:36.317828 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 17 19:12:37 crc kubenswrapper[4892]: I0217 19:12:37.331593 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d6e36b08-e650-4dec-82ee-72c4e5e013d7","Type":"ContainerStarted","Data":"e1a59eaa03754c23bcbd3d5fec43f8c31822b54319929fc77e08afa705fa9d9f"} Feb 17 19:12:37 crc kubenswrapper[4892]: I0217 19:12:37.332288 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d6e36b08-e650-4dec-82ee-72c4e5e013d7","Type":"ContainerStarted","Data":"2999a629c77c6b34bf455493030031d4ad3c4a8893d3ff4981c0e016cafe8eef"} Feb 17 19:12:37 crc kubenswrapper[4892]: I0217 19:12:37.356129 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.356108411 podStartE2EDuration="3.356108411s" podCreationTimestamp="2026-02-17 19:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:37.348628489 +0000 UTC m=+5328.724031794" watchObservedRunningTime="2026-02-17 19:12:37.356108411 +0000 UTC m=+5328.731511676" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.024027 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.031776 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.036447 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.041750 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.041849 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hs426" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.041990 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.045757 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.108660 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-mb544"] Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.108943 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" podUID="7d80642f-243c-4542-b750-70696772e89d" containerName="dnsmasq-dns" containerID="cri-o://0508f7e80818138c836a3a4efe169f8954d5e9e826faa4000e6ab500b126fb12" gracePeriod=10 Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.135316 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbzhd\" (UniqueName: \"kubernetes.io/projected/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-kube-api-access-hbzhd\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.135778 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " 
pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.135930 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-config\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.135994 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-scripts\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.136047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.237945 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-scripts\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.237997 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.238049 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbzhd\" (UniqueName: 
\"kubernetes.io/projected/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-kube-api-access-hbzhd\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.238123 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.238161 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-config\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.240414 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-config\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.244105 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.245510 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-scripts\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.250318 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.261371 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbzhd\" (UniqueName: \"kubernetes.io/projected/cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2-kube-api-access-hbzhd\") pod \"ovn-northd-0\" (UID: \"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2\") " pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.359758 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.400762 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d80642f-243c-4542-b750-70696772e89d" containerID="0508f7e80818138c836a3a4efe169f8954d5e9e826faa4000e6ab500b126fb12" exitCode=0 Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.400810 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" event={"ID":"7d80642f-243c-4542-b750-70696772e89d","Type":"ContainerDied","Data":"0508f7e80818138c836a3a4efe169f8954d5e9e826faa4000e6ab500b126fb12"} Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.606616 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.682946 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.750646 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-dns-svc\") pod \"7d80642f-243c-4542-b750-70696772e89d\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.750771 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prs44\" (UniqueName: \"kubernetes.io/projected/7d80642f-243c-4542-b750-70696772e89d-kube-api-access-prs44\") pod \"7d80642f-243c-4542-b750-70696772e89d\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.750864 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-config\") pod \"7d80642f-243c-4542-b750-70696772e89d\" (UID: \"7d80642f-243c-4542-b750-70696772e89d\") " Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.756407 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d80642f-243c-4542-b750-70696772e89d-kube-api-access-prs44" (OuterVolumeSpecName: "kube-api-access-prs44") pod "7d80642f-243c-4542-b750-70696772e89d" (UID: "7d80642f-243c-4542-b750-70696772e89d"). InnerVolumeSpecName "kube-api-access-prs44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.840426 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d80642f-243c-4542-b750-70696772e89d" (UID: "7d80642f-243c-4542-b750-70696772e89d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.840968 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-config" (OuterVolumeSpecName: "config") pod "7d80642f-243c-4542-b750-70696772e89d" (UID: "7d80642f-243c-4542-b750-70696772e89d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.855905 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.855932 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prs44\" (UniqueName: \"kubernetes.io/projected/7d80642f-243c-4542-b750-70696772e89d-kube-api-access-prs44\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:43 crc kubenswrapper[4892]: I0217 19:12:43.855945 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d80642f-243c-4542-b750-70696772e89d-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.414357 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.414372 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-mb544" event={"ID":"7d80642f-243c-4542-b750-70696772e89d","Type":"ContainerDied","Data":"d9452f1662bab52c8aa2e4972e25e6404df48084df5624f167050de142cebcea"} Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.414956 4892 scope.go:117] "RemoveContainer" containerID="0508f7e80818138c836a3a4efe169f8954d5e9e826faa4000e6ab500b126fb12" Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.417637 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2","Type":"ContainerStarted","Data":"93c67eac123dc0e555434605c99a91ac2daf9d191438f6b9d87f2d9bb1a294dd"} Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.417686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2","Type":"ContainerStarted","Data":"fd9956f40af6ed37cce182a17d61d93fe9caf06bfda366d0ec53bba7a8ba47a5"} Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.417705 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2","Type":"ContainerStarted","Data":"250d6766ad84cfdf92ee5370b95b212b1e8a9a86b57fb8971794a22949678462"} Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.417879 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.441235 4892 scope.go:117] "RemoveContainer" containerID="e3443eccf8ac9db843bff666c60a97e0c8b0794fde6e3954541215db1dc302d4" Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.467980 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.467957783 
podStartE2EDuration="2.467957783s" podCreationTimestamp="2026-02-17 19:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:44.450133101 +0000 UTC m=+5335.825536396" watchObservedRunningTime="2026-02-17 19:12:44.467957783 +0000 UTC m=+5335.843361058" Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.491604 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-mb544"] Feb 17 19:12:44 crc kubenswrapper[4892]: I0217 19:12:44.507504 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-mb544"] Feb 17 19:12:45 crc kubenswrapper[4892]: I0217 19:12:45.376535 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d80642f-243c-4542-b750-70696772e89d" path="/var/lib/kubelet/pods/7d80642f-243c-4542-b750-70696772e89d/volumes" Feb 17 19:12:46 crc kubenswrapper[4892]: I0217 19:12:46.359800 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:12:46 crc kubenswrapper[4892]: E0217 19:12:46.360235 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.442248 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bd2v5"] Feb 17 19:12:48 crc kubenswrapper[4892]: E0217 19:12:48.442756 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d80642f-243c-4542-b750-70696772e89d" containerName="init" Feb 17 19:12:48 crc kubenswrapper[4892]: 
I0217 19:12:48.442774 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d80642f-243c-4542-b750-70696772e89d" containerName="init" Feb 17 19:12:48 crc kubenswrapper[4892]: E0217 19:12:48.442892 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d80642f-243c-4542-b750-70696772e89d" containerName="dnsmasq-dns" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.442903 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d80642f-243c-4542-b750-70696772e89d" containerName="dnsmasq-dns" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.443136 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d80642f-243c-4542-b750-70696772e89d" containerName="dnsmasq-dns" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.443871 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.456472 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bd2v5"] Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.529775 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-28ef-account-create-update-rlgx4"] Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.530902 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.532647 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.547387 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-28ef-account-create-update-rlgx4"] Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.547668 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp9q7\" (UniqueName: \"kubernetes.io/projected/5c6fb2ec-298f-4592-afd1-2eb29fb08684-kube-api-access-cp9q7\") pod \"keystone-db-create-bd2v5\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.547734 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6fb2ec-298f-4592-afd1-2eb29fb08684-operator-scripts\") pod \"keystone-db-create-bd2v5\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.649998 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4g6b\" (UniqueName: \"kubernetes.io/projected/9b312301-b3f1-4ef3-bf19-cb59fe062e42-kube-api-access-h4g6b\") pod \"keystone-28ef-account-create-update-rlgx4\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.650292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp9q7\" (UniqueName: \"kubernetes.io/projected/5c6fb2ec-298f-4592-afd1-2eb29fb08684-kube-api-access-cp9q7\") pod \"keystone-db-create-bd2v5\" (UID: 
\"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.650475 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6fb2ec-298f-4592-afd1-2eb29fb08684-operator-scripts\") pod \"keystone-db-create-bd2v5\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.650546 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b312301-b3f1-4ef3-bf19-cb59fe062e42-operator-scripts\") pod \"keystone-28ef-account-create-update-rlgx4\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.651680 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6fb2ec-298f-4592-afd1-2eb29fb08684-operator-scripts\") pod \"keystone-db-create-bd2v5\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.671512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp9q7\" (UniqueName: \"kubernetes.io/projected/5c6fb2ec-298f-4592-afd1-2eb29fb08684-kube-api-access-cp9q7\") pod \"keystone-db-create-bd2v5\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.751988 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4g6b\" (UniqueName: \"kubernetes.io/projected/9b312301-b3f1-4ef3-bf19-cb59fe062e42-kube-api-access-h4g6b\") pod \"keystone-28ef-account-create-update-rlgx4\" (UID: 
\"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.752135 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b312301-b3f1-4ef3-bf19-cb59fe062e42-operator-scripts\") pod \"keystone-28ef-account-create-update-rlgx4\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.752760 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b312301-b3f1-4ef3-bf19-cb59fe062e42-operator-scripts\") pod \"keystone-28ef-account-create-update-rlgx4\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.760440 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.769971 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4g6b\" (UniqueName: \"kubernetes.io/projected/9b312301-b3f1-4ef3-bf19-cb59fe062e42-kube-api-access-h4g6b\") pod \"keystone-28ef-account-create-update-rlgx4\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:48 crc kubenswrapper[4892]: I0217 19:12:48.847217 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:49 crc kubenswrapper[4892]: I0217 19:12:49.253705 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bd2v5"] Feb 17 19:12:49 crc kubenswrapper[4892]: I0217 19:12:49.341781 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-28ef-account-create-update-rlgx4"] Feb 17 19:12:49 crc kubenswrapper[4892]: W0217 19:12:49.483625 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6fb2ec_298f_4592_afd1_2eb29fb08684.slice/crio-994973bf1273f0b7fa3bfa2b6230e4f47627021ed2b1a15afacd50e7536cb7d9 WatchSource:0}: Error finding container 994973bf1273f0b7fa3bfa2b6230e4f47627021ed2b1a15afacd50e7536cb7d9: Status 404 returned error can't find the container with id 994973bf1273f0b7fa3bfa2b6230e4f47627021ed2b1a15afacd50e7536cb7d9 Feb 17 19:12:49 crc kubenswrapper[4892]: I0217 19:12:49.491497 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 19:12:50 crc kubenswrapper[4892]: I0217 19:12:50.484996 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b312301-b3f1-4ef3-bf19-cb59fe062e42" containerID="14b1f15a077528a3858cd63adb897e0b4ae6a714ddd839d19d298cf8442aac1b" exitCode=0 Feb 17 19:12:50 crc kubenswrapper[4892]: I0217 19:12:50.485238 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-28ef-account-create-update-rlgx4" event={"ID":"9b312301-b3f1-4ef3-bf19-cb59fe062e42","Type":"ContainerDied","Data":"14b1f15a077528a3858cd63adb897e0b4ae6a714ddd839d19d298cf8442aac1b"} Feb 17 19:12:50 crc kubenswrapper[4892]: I0217 19:12:50.486204 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-28ef-account-create-update-rlgx4" 
event={"ID":"9b312301-b3f1-4ef3-bf19-cb59fe062e42","Type":"ContainerStarted","Data":"c85d2405808b1821033cbd23ca721a358c4e2976c6c491026537b2bb2650b71f"} Feb 17 19:12:50 crc kubenswrapper[4892]: I0217 19:12:50.490337 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c6fb2ec-298f-4592-afd1-2eb29fb08684" containerID="1aeb8b42dd0468752bc3ecdd432d7564864f6dd46c59d60f10eee335910c95c1" exitCode=0 Feb 17 19:12:50 crc kubenswrapper[4892]: I0217 19:12:50.490374 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bd2v5" event={"ID":"5c6fb2ec-298f-4592-afd1-2eb29fb08684","Type":"ContainerDied","Data":"1aeb8b42dd0468752bc3ecdd432d7564864f6dd46c59d60f10eee335910c95c1"} Feb 17 19:12:50 crc kubenswrapper[4892]: I0217 19:12:50.490417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bd2v5" event={"ID":"5c6fb2ec-298f-4592-afd1-2eb29fb08684","Type":"ContainerStarted","Data":"994973bf1273f0b7fa3bfa2b6230e4f47627021ed2b1a15afacd50e7536cb7d9"} Feb 17 19:12:51 crc kubenswrapper[4892]: I0217 19:12:51.997357 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.002629 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.113564 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b312301-b3f1-4ef3-bf19-cb59fe062e42-operator-scripts\") pod \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.113693 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp9q7\" (UniqueName: \"kubernetes.io/projected/5c6fb2ec-298f-4592-afd1-2eb29fb08684-kube-api-access-cp9q7\") pod \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.113748 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4g6b\" (UniqueName: \"kubernetes.io/projected/9b312301-b3f1-4ef3-bf19-cb59fe062e42-kube-api-access-h4g6b\") pod \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\" (UID: \"9b312301-b3f1-4ef3-bf19-cb59fe062e42\") " Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.113765 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6fb2ec-298f-4592-afd1-2eb29fb08684-operator-scripts\") pod \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\" (UID: \"5c6fb2ec-298f-4592-afd1-2eb29fb08684\") " Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.114587 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c6fb2ec-298f-4592-afd1-2eb29fb08684-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c6fb2ec-298f-4592-afd1-2eb29fb08684" (UID: "5c6fb2ec-298f-4592-afd1-2eb29fb08684"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.114596 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b312301-b3f1-4ef3-bf19-cb59fe062e42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b312301-b3f1-4ef3-bf19-cb59fe062e42" (UID: "9b312301-b3f1-4ef3-bf19-cb59fe062e42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.120182 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6fb2ec-298f-4592-afd1-2eb29fb08684-kube-api-access-cp9q7" (OuterVolumeSpecName: "kube-api-access-cp9q7") pod "5c6fb2ec-298f-4592-afd1-2eb29fb08684" (UID: "5c6fb2ec-298f-4592-afd1-2eb29fb08684"). InnerVolumeSpecName "kube-api-access-cp9q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.120217 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b312301-b3f1-4ef3-bf19-cb59fe062e42-kube-api-access-h4g6b" (OuterVolumeSpecName: "kube-api-access-h4g6b") pod "9b312301-b3f1-4ef3-bf19-cb59fe062e42" (UID: "9b312301-b3f1-4ef3-bf19-cb59fe062e42"). InnerVolumeSpecName "kube-api-access-h4g6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.215603 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp9q7\" (UniqueName: \"kubernetes.io/projected/5c6fb2ec-298f-4592-afd1-2eb29fb08684-kube-api-access-cp9q7\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.215636 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4g6b\" (UniqueName: \"kubernetes.io/projected/9b312301-b3f1-4ef3-bf19-cb59fe062e42-kube-api-access-h4g6b\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.215652 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6fb2ec-298f-4592-afd1-2eb29fb08684-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.215663 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b312301-b3f1-4ef3-bf19-cb59fe062e42-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.516162 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-28ef-account-create-update-rlgx4" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.516313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-28ef-account-create-update-rlgx4" event={"ID":"9b312301-b3f1-4ef3-bf19-cb59fe062e42","Type":"ContainerDied","Data":"c85d2405808b1821033cbd23ca721a358c4e2976c6c491026537b2bb2650b71f"} Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.516347 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85d2405808b1821033cbd23ca721a358c4e2976c6c491026537b2bb2650b71f" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.518415 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bd2v5" event={"ID":"5c6fb2ec-298f-4592-afd1-2eb29fb08684","Type":"ContainerDied","Data":"994973bf1273f0b7fa3bfa2b6230e4f47627021ed2b1a15afacd50e7536cb7d9"} Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.518433 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994973bf1273f0b7fa3bfa2b6230e4f47627021ed2b1a15afacd50e7536cb7d9" Feb 17 19:12:52 crc kubenswrapper[4892]: I0217 19:12:52.518482 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bd2v5" Feb 17 19:12:53 crc kubenswrapper[4892]: I0217 19:12:53.484344 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.188739 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-s74rl"] Feb 17 19:12:54 crc kubenswrapper[4892]: E0217 19:12:54.196395 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6fb2ec-298f-4592-afd1-2eb29fb08684" containerName="mariadb-database-create" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.196442 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6fb2ec-298f-4592-afd1-2eb29fb08684" containerName="mariadb-database-create" Feb 17 19:12:54 crc kubenswrapper[4892]: E0217 19:12:54.196501 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b312301-b3f1-4ef3-bf19-cb59fe062e42" containerName="mariadb-account-create-update" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.196511 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b312301-b3f1-4ef3-bf19-cb59fe062e42" containerName="mariadb-account-create-update" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.200687 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b312301-b3f1-4ef3-bf19-cb59fe062e42" containerName="mariadb-account-create-update" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.200746 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6fb2ec-298f-4592-afd1-2eb29fb08684" containerName="mariadb-database-create" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.203036 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.206871 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.206937 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.206974 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlfb6" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.208596 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.244247 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-s74rl"] Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.363673 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbnbc\" (UniqueName: \"kubernetes.io/projected/f32656b5-b635-4abb-a3f0-fd6bc228c874-kube-api-access-fbnbc\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.363878 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-config-data\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.364134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-combined-ca-bundle\") pod \"keystone-db-sync-s74rl\" (UID: 
\"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.466698 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbnbc\" (UniqueName: \"kubernetes.io/projected/f32656b5-b635-4abb-a3f0-fd6bc228c874-kube-api-access-fbnbc\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.466835 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-config-data\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.466994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-combined-ca-bundle\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.478960 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-combined-ca-bundle\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.479138 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-config-data\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: 
I0217 19:12:54.513736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbnbc\" (UniqueName: \"kubernetes.io/projected/f32656b5-b635-4abb-a3f0-fd6bc228c874-kube-api-access-fbnbc\") pod \"keystone-db-sync-s74rl\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:54 crc kubenswrapper[4892]: I0217 19:12:54.541462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:55 crc kubenswrapper[4892]: I0217 19:12:55.129063 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-s74rl"] Feb 17 19:12:55 crc kubenswrapper[4892]: I0217 19:12:55.570048 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s74rl" event={"ID":"f32656b5-b635-4abb-a3f0-fd6bc228c874","Type":"ContainerStarted","Data":"5add7a11b80cc08f7ced43002f94bbff103efe8941c5af602bc4243576960d27"} Feb 17 19:12:55 crc kubenswrapper[4892]: I0217 19:12:55.570561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s74rl" event={"ID":"f32656b5-b635-4abb-a3f0-fd6bc228c874","Type":"ContainerStarted","Data":"5d93a071566b67a2755425a836ac5cab6165e628a8be9262503bd0838b8fad17"} Feb 17 19:12:55 crc kubenswrapper[4892]: I0217 19:12:55.607858 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-s74rl" podStartSLOduration=1.607793398 podStartE2EDuration="1.607793398s" podCreationTimestamp="2026-02-17 19:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:12:55.600198483 +0000 UTC m=+5346.975601748" watchObservedRunningTime="2026-02-17 19:12:55.607793398 +0000 UTC m=+5346.983196703" Feb 17 19:12:57 crc kubenswrapper[4892]: E0217 19:12:57.102242 4892 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf32656b5_b635_4abb_a3f0_fd6bc228c874.slice/crio-5add7a11b80cc08f7ced43002f94bbff103efe8941c5af602bc4243576960d27.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf32656b5_b635_4abb_a3f0_fd6bc228c874.slice/crio-conmon-5add7a11b80cc08f7ced43002f94bbff103efe8941c5af602bc4243576960d27.scope\": RecentStats: unable to find data in memory cache]" Feb 17 19:12:57 crc kubenswrapper[4892]: I0217 19:12:57.796530 4892 generic.go:334] "Generic (PLEG): container finished" podID="f32656b5-b635-4abb-a3f0-fd6bc228c874" containerID="5add7a11b80cc08f7ced43002f94bbff103efe8941c5af602bc4243576960d27" exitCode=0 Feb 17 19:12:57 crc kubenswrapper[4892]: I0217 19:12:57.796583 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s74rl" event={"ID":"f32656b5-b635-4abb-a3f0-fd6bc228c874","Type":"ContainerDied","Data":"5add7a11b80cc08f7ced43002f94bbff103efe8941c5af602bc4243576960d27"} Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.269119 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-s74rl" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.373972 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-combined-ca-bundle\") pod \"f32656b5-b635-4abb-a3f0-fd6bc228c874\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.374197 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-config-data\") pod \"f32656b5-b635-4abb-a3f0-fd6bc228c874\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.374297 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbnbc\" (UniqueName: \"kubernetes.io/projected/f32656b5-b635-4abb-a3f0-fd6bc228c874-kube-api-access-fbnbc\") pod \"f32656b5-b635-4abb-a3f0-fd6bc228c874\" (UID: \"f32656b5-b635-4abb-a3f0-fd6bc228c874\") " Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.395953 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32656b5-b635-4abb-a3f0-fd6bc228c874-kube-api-access-fbnbc" (OuterVolumeSpecName: "kube-api-access-fbnbc") pod "f32656b5-b635-4abb-a3f0-fd6bc228c874" (UID: "f32656b5-b635-4abb-a3f0-fd6bc228c874"). InnerVolumeSpecName "kube-api-access-fbnbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.401147 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f32656b5-b635-4abb-a3f0-fd6bc228c874" (UID: "f32656b5-b635-4abb-a3f0-fd6bc228c874"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.437000 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-config-data" (OuterVolumeSpecName: "config-data") pod "f32656b5-b635-4abb-a3f0-fd6bc228c874" (UID: "f32656b5-b635-4abb-a3f0-fd6bc228c874"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.476967 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.476995 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbnbc\" (UniqueName: \"kubernetes.io/projected/f32656b5-b635-4abb-a3f0-fd6bc228c874-kube-api-access-fbnbc\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.477007 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32656b5-b635-4abb-a3f0-fd6bc228c874-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.825977 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s74rl" event={"ID":"f32656b5-b635-4abb-a3f0-fd6bc228c874","Type":"ContainerDied","Data":"5d93a071566b67a2755425a836ac5cab6165e628a8be9262503bd0838b8fad17"} Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.826018 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d93a071566b67a2755425a836ac5cab6165e628a8be9262503bd0838b8fad17" Feb 17 19:12:59 crc kubenswrapper[4892]: I0217 19:12:59.826032 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-s74rl" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.162677 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mzvbx"] Feb 17 19:13:00 crc kubenswrapper[4892]: E0217 19:13:00.163119 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32656b5-b635-4abb-a3f0-fd6bc228c874" containerName="keystone-db-sync" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.163142 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32656b5-b635-4abb-a3f0-fd6bc228c874" containerName="keystone-db-sync" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.163332 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32656b5-b635-4abb-a3f0-fd6bc228c874" containerName="keystone-db-sync" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.163948 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.170530 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.170739 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.170968 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.171234 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlfb6" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.171308 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.183909 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-779db6986f-pv8lv"] Feb 17 19:13:00 crc 
kubenswrapper[4892]: I0217 19:13:00.185649 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.244930 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779db6986f-pv8lv"] Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.259368 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mzvbx"] Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.293501 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-config\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.293555 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-sb\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.293587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.293724 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-dns-svc\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: 
\"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.293803 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4gg\" (UniqueName: \"kubernetes.io/projected/deca5be5-0971-47ec-858d-8d678eb3b961-kube-api-access-bn4gg\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.293977 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-credential-keys\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.294114 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjtl\" (UniqueName: \"kubernetes.io/projected/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-kube-api-access-jrjtl\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.294141 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-nb\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.294165 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-config-data\") 
pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.294268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-scripts\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.294281 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-fernet-keys\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.395864 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4gg\" (UniqueName: \"kubernetes.io/projected/deca5be5-0971-47ec-858d-8d678eb3b961-kube-api-access-bn4gg\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.395939 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-credential-keys\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.395994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjtl\" (UniqueName: \"kubernetes.io/projected/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-kube-api-access-jrjtl\") pod \"keystone-bootstrap-mzvbx\" (UID: 
\"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396014 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-nb\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396030 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-config-data\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396085 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-scripts\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396101 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-fernet-keys\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396123 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-config\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: 
I0217 19:13:00.396149 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-sb\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396177 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.396213 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-dns-svc\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.397158 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-dns-svc\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.398868 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-config\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.398900 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-nb\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.399658 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-sb\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.403562 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.403895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-credential-keys\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.404190 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-scripts\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.413696 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-config-data\") pod \"keystone-bootstrap-mzvbx\" (UID: 
\"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.415996 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4gg\" (UniqueName: \"kubernetes.io/projected/deca5be5-0971-47ec-858d-8d678eb3b961-kube-api-access-bn4gg\") pod \"dnsmasq-dns-779db6986f-pv8lv\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") " pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.416898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrjtl\" (UniqueName: \"kubernetes.io/projected/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-kube-api-access-jrjtl\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.433519 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-fernet-keys\") pod \"keystone-bootstrap-mzvbx\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.490237 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.496539 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:00 crc kubenswrapper[4892]: I0217 19:13:00.967227 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779db6986f-pv8lv"] Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.044799 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mzvbx"] Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.362249 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:13:01 crc kubenswrapper[4892]: E0217 19:13:01.362503 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.850715 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzvbx" event={"ID":"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef","Type":"ContainerStarted","Data":"1d4abc8967695e153a07c6e0c365169f9fb3898c506bc87a28d5bb8889d4660d"} Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.851091 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzvbx" event={"ID":"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef","Type":"ContainerStarted","Data":"7faf596d399a6c180a34d81a8c82d2d030e15ec32a717b4a66f0434c28100f45"} Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.854166 4892 generic.go:334] "Generic (PLEG): container finished" podID="deca5be5-0971-47ec-858d-8d678eb3b961" containerID="4643169c90fc1fdc9a3900dbab08da34ef1d78185a79fb7de4584a41e5ca055b" exitCode=0 Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 
19:13:01.854204 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" event={"ID":"deca5be5-0971-47ec-858d-8d678eb3b961","Type":"ContainerDied","Data":"4643169c90fc1fdc9a3900dbab08da34ef1d78185a79fb7de4584a41e5ca055b"} Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.854224 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" event={"ID":"deca5be5-0971-47ec-858d-8d678eb3b961","Type":"ContainerStarted","Data":"8fe9cddc5b7d844468a4ec08a123cae42b1320bbafbe9ecbaa7508fdb8f9475b"} Feb 17 19:13:01 crc kubenswrapper[4892]: I0217 19:13:01.880939 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mzvbx" podStartSLOduration=1.880924644 podStartE2EDuration="1.880924644s" podCreationTimestamp="2026-02-17 19:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:13:01.878128589 +0000 UTC m=+5353.253531854" watchObservedRunningTime="2026-02-17 19:13:01.880924644 +0000 UTC m=+5353.256327899" Feb 17 19:13:02 crc kubenswrapper[4892]: I0217 19:13:02.881842 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" event={"ID":"deca5be5-0971-47ec-858d-8d678eb3b961","Type":"ContainerStarted","Data":"e836dd11c243b88adbffea6b4e0ce44bb7630d69cb315f5a5476982785963926"} Feb 17 19:13:02 crc kubenswrapper[4892]: I0217 19:13:02.882180 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:02 crc kubenswrapper[4892]: I0217 19:13:02.911233 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" podStartSLOduration=2.911215638 podStartE2EDuration="2.911215638s" podCreationTimestamp="2026-02-17 19:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:13:02.903030777 +0000 UTC m=+5354.278434072" watchObservedRunningTime="2026-02-17 19:13:02.911215638 +0000 UTC m=+5354.286618903" Feb 17 19:13:04 crc kubenswrapper[4892]: I0217 19:13:04.900463 4892 generic.go:334] "Generic (PLEG): container finished" podID="7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" containerID="1d4abc8967695e153a07c6e0c365169f9fb3898c506bc87a28d5bb8889d4660d" exitCode=0 Feb 17 19:13:04 crc kubenswrapper[4892]: I0217 19:13:04.900633 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzvbx" event={"ID":"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef","Type":"ContainerDied","Data":"1d4abc8967695e153a07c6e0c365169f9fb3898c506bc87a28d5bb8889d4660d"} Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.338758 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.414941 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrjtl\" (UniqueName: \"kubernetes.io/projected/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-kube-api-access-jrjtl\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.415031 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-fernet-keys\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.415173 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-config-data\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: 
\"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.415215 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.415249 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-credential-keys\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.415322 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-scripts\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.421641 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-scripts" (OuterVolumeSpecName: "scripts") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.423072 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.424652 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-kube-api-access-jrjtl" (OuterVolumeSpecName: "kube-api-access-jrjtl") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef"). InnerVolumeSpecName "kube-api-access-jrjtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.430036 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:06 crc kubenswrapper[4892]: E0217 19:13:06.456644 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle podName:7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef nodeName:}" failed. No retries permitted until 2026-02-17 19:13:06.956620715 +0000 UTC m=+5358.332023980 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef") : error deleting /var/lib/kubelet/pods/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef/volume-subpaths: remove /var/lib/kubelet/pods/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef/volume-subpaths: no such file or directory Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.460048 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-config-data" (OuterVolumeSpecName: "config-data") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.518411 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.518472 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.518500 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.518525 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrjtl\" (UniqueName: \"kubernetes.io/projected/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-kube-api-access-jrjtl\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 
19:13:06.518550 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.931210 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzvbx" event={"ID":"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef","Type":"ContainerDied","Data":"7faf596d399a6c180a34d81a8c82d2d030e15ec32a717b4a66f0434c28100f45"} Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.932192 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7faf596d399a6c180a34d81a8c82d2d030e15ec32a717b4a66f0434c28100f45" Feb 17 19:13:06 crc kubenswrapper[4892]: I0217 19:13:06.931327 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzvbx" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.024379 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mzvbx"] Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.026483 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle\") pod \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\" (UID: \"7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef\") " Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.030517 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" (UID: "7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.033084 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mzvbx"] Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.116053 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jpgjq"] Feb 17 19:13:07 crc kubenswrapper[4892]: E0217 19:13:07.116492 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" containerName="keystone-bootstrap" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.116511 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" containerName="keystone-bootstrap" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.116782 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" containerName="keystone-bootstrap" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.117647 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.128554 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.138870 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jpgjq"] Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.229657 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-combined-ca-bundle\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.229731 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-scripts\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.229767 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2h72\" (UniqueName: \"kubernetes.io/projected/7813c451-11c5-4d33-bfca-baa5b12a2235-kube-api-access-g2h72\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.229918 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-credential-keys\") pod 
\"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.229955 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-config-data\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.230093 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-fernet-keys\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.331223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-config-data\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.331315 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-fernet-keys\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.331361 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-combined-ca-bundle\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " 
pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.331407 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-scripts\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.331430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2h72\" (UniqueName: \"kubernetes.io/projected/7813c451-11c5-4d33-bfca-baa5b12a2235-kube-api-access-g2h72\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.331517 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-credential-keys\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.335016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-scripts\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.335276 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-fernet-keys\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.336496 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-config-data\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.337151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-combined-ca-bundle\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.340073 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-credential-keys\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.349477 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2h72\" (UniqueName: \"kubernetes.io/projected/7813c451-11c5-4d33-bfca-baa5b12a2235-kube-api-access-g2h72\") pod \"keystone-bootstrap-jpgjq\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:07 crc kubenswrapper[4892]: E0217 19:13:07.365410 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d2cc988_ecaa_4775_8bdd_09f1eb84c8ef.slice/crio-7faf596d399a6c180a34d81a8c82d2d030e15ec32a717b4a66f0434c28100f45\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d2cc988_ecaa_4775_8bdd_09f1eb84c8ef.slice\": RecentStats: unable to find data in memory cache]" Feb 17 19:13:07 crc 
kubenswrapper[4892]: I0217 19:13:07.383518 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef" path="/var/lib/kubelet/pods/7d2cc988-ecaa-4775-8bdd-09f1eb84c8ef/volumes" Feb 17 19:13:07 crc kubenswrapper[4892]: I0217 19:13:07.487722 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:08 crc kubenswrapper[4892]: I0217 19:13:08.053504 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jpgjq"] Feb 17 19:13:08 crc kubenswrapper[4892]: I0217 19:13:08.954119 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpgjq" event={"ID":"7813c451-11c5-4d33-bfca-baa5b12a2235","Type":"ContainerStarted","Data":"e6e78e0f2a0470cbf58e69d1585bb6969f0d97e46cc06cd5ef2a9b3afb3e688f"} Feb 17 19:13:08 crc kubenswrapper[4892]: I0217 19:13:08.954503 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpgjq" event={"ID":"7813c451-11c5-4d33-bfca-baa5b12a2235","Type":"ContainerStarted","Data":"b22ec846fcde859280e8de34ed40189cb135e2c3925331f3cc5eaba7ebc1b9c8"} Feb 17 19:13:08 crc kubenswrapper[4892]: I0217 19:13:08.990358 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jpgjq" podStartSLOduration=1.990327641 podStartE2EDuration="1.990327641s" podCreationTimestamp="2026-02-17 19:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:13:08.973058056 +0000 UTC m=+5360.348461361" watchObservedRunningTime="2026-02-17 19:13:08.990327641 +0000 UTC m=+5360.365730946" Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 19:13:10.498013 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 
19:13:10.578129 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b798876ff-xz2gr"] Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 19:13:10.578415 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" podUID="462a6f91-70eb-412e-9c96-104545e1a695" containerName="dnsmasq-dns" containerID="cri-o://59f78a5ea41e8cce831a0d1a4fbe477fd8085e9f9d45e089134c804f21d3e03e" gracePeriod=10 Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 19:13:10.978681 4892 generic.go:334] "Generic (PLEG): container finished" podID="462a6f91-70eb-412e-9c96-104545e1a695" containerID="59f78a5ea41e8cce831a0d1a4fbe477fd8085e9f9d45e089134c804f21d3e03e" exitCode=0 Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 19:13:10.978748 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" event={"ID":"462a6f91-70eb-412e-9c96-104545e1a695","Type":"ContainerDied","Data":"59f78a5ea41e8cce831a0d1a4fbe477fd8085e9f9d45e089134c804f21d3e03e"} Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 19:13:10.981233 4892 generic.go:334] "Generic (PLEG): container finished" podID="7813c451-11c5-4d33-bfca-baa5b12a2235" containerID="e6e78e0f2a0470cbf58e69d1585bb6969f0d97e46cc06cd5ef2a9b3afb3e688f" exitCode=0 Feb 17 19:13:10 crc kubenswrapper[4892]: I0217 19:13:10.981274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpgjq" event={"ID":"7813c451-11c5-4d33-bfca-baa5b12a2235","Type":"ContainerDied","Data":"e6e78e0f2a0470cbf58e69d1585bb6969f0d97e46cc06cd5ef2a9b3afb3e688f"} Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.528945 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.626011 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-config\") pod \"462a6f91-70eb-412e-9c96-104545e1a695\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.626068 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-dns-svc\") pod \"462a6f91-70eb-412e-9c96-104545e1a695\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.626099 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-nb\") pod \"462a6f91-70eb-412e-9c96-104545e1a695\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.626126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jpth\" (UniqueName: \"kubernetes.io/projected/462a6f91-70eb-412e-9c96-104545e1a695-kube-api-access-4jpth\") pod \"462a6f91-70eb-412e-9c96-104545e1a695\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.626178 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-sb\") pod \"462a6f91-70eb-412e-9c96-104545e1a695\" (UID: \"462a6f91-70eb-412e-9c96-104545e1a695\") " Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.632680 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/462a6f91-70eb-412e-9c96-104545e1a695-kube-api-access-4jpth" (OuterVolumeSpecName: "kube-api-access-4jpth") pod "462a6f91-70eb-412e-9c96-104545e1a695" (UID: "462a6f91-70eb-412e-9c96-104545e1a695"). InnerVolumeSpecName "kube-api-access-4jpth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.673921 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "462a6f91-70eb-412e-9c96-104545e1a695" (UID: "462a6f91-70eb-412e-9c96-104545e1a695"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.679321 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "462a6f91-70eb-412e-9c96-104545e1a695" (UID: "462a6f91-70eb-412e-9c96-104545e1a695"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.680997 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-config" (OuterVolumeSpecName: "config") pod "462a6f91-70eb-412e-9c96-104545e1a695" (UID: "462a6f91-70eb-412e-9c96-104545e1a695"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.688191 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "462a6f91-70eb-412e-9c96-104545e1a695" (UID: "462a6f91-70eb-412e-9c96-104545e1a695"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.728495 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.728546 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.728563 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.728583 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/462a6f91-70eb-412e-9c96-104545e1a695-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.728601 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jpth\" (UniqueName: \"kubernetes.io/projected/462a6f91-70eb-412e-9c96-104545e1a695-kube-api-access-4jpth\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.994785 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.995234 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b798876ff-xz2gr" event={"ID":"462a6f91-70eb-412e-9c96-104545e1a695","Type":"ContainerDied","Data":"c34c4d43be081e54d86a9876d8f5b12e5315965588ef565d5bc64ba6077612a6"} Feb 17 19:13:11 crc kubenswrapper[4892]: I0217 19:13:11.995909 4892 scope.go:117] "RemoveContainer" containerID="59f78a5ea41e8cce831a0d1a4fbe477fd8085e9f9d45e089134c804f21d3e03e" Feb 17 19:13:12 crc kubenswrapper[4892]: I0217 19:13:12.047135 4892 scope.go:117] "RemoveContainer" containerID="fe576cc0825d047473bda3255f916353177d4c9d6a3c5f9ce47b564f1915d83a" Feb 17 19:13:12 crc kubenswrapper[4892]: I0217 19:13:12.057307 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b798876ff-xz2gr"] Feb 17 19:13:12 crc kubenswrapper[4892]: I0217 19:13:12.067843 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b798876ff-xz2gr"] Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.010262 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpgjq" event={"ID":"7813c451-11c5-4d33-bfca-baa5b12a2235","Type":"ContainerDied","Data":"b22ec846fcde859280e8de34ed40189cb135e2c3925331f3cc5eaba7ebc1b9c8"} Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.010301 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22ec846fcde859280e8de34ed40189cb135e2c3925331f3cc5eaba7ebc1b9c8" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.115752 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.253749 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-fernet-keys\") pod \"7813c451-11c5-4d33-bfca-baa5b12a2235\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.253927 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-combined-ca-bundle\") pod \"7813c451-11c5-4d33-bfca-baa5b12a2235\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.253965 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-config-data\") pod \"7813c451-11c5-4d33-bfca-baa5b12a2235\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.254003 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-scripts\") pod \"7813c451-11c5-4d33-bfca-baa5b12a2235\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.254065 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2h72\" (UniqueName: \"kubernetes.io/projected/7813c451-11c5-4d33-bfca-baa5b12a2235-kube-api-access-g2h72\") pod \"7813c451-11c5-4d33-bfca-baa5b12a2235\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.254116 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-credential-keys\") pod \"7813c451-11c5-4d33-bfca-baa5b12a2235\" (UID: \"7813c451-11c5-4d33-bfca-baa5b12a2235\") " Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.258673 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-scripts" (OuterVolumeSpecName: "scripts") pod "7813c451-11c5-4d33-bfca-baa5b12a2235" (UID: "7813c451-11c5-4d33-bfca-baa5b12a2235"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.258920 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7813c451-11c5-4d33-bfca-baa5b12a2235" (UID: "7813c451-11c5-4d33-bfca-baa5b12a2235"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.259045 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7813c451-11c5-4d33-bfca-baa5b12a2235" (UID: "7813c451-11c5-4d33-bfca-baa5b12a2235"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.260535 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7813c451-11c5-4d33-bfca-baa5b12a2235-kube-api-access-g2h72" (OuterVolumeSpecName: "kube-api-access-g2h72") pod "7813c451-11c5-4d33-bfca-baa5b12a2235" (UID: "7813c451-11c5-4d33-bfca-baa5b12a2235"). InnerVolumeSpecName "kube-api-access-g2h72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.277450 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-config-data" (OuterVolumeSpecName: "config-data") pod "7813c451-11c5-4d33-bfca-baa5b12a2235" (UID: "7813c451-11c5-4d33-bfca-baa5b12a2235"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.293842 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7813c451-11c5-4d33-bfca-baa5b12a2235" (UID: "7813c451-11c5-4d33-bfca-baa5b12a2235"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.356379 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2h72\" (UniqueName: \"kubernetes.io/projected/7813c451-11c5-4d33-bfca-baa5b12a2235-kube-api-access-g2h72\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.356408 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.356417 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.356427 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.356437 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.356444 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7813c451-11c5-4d33-bfca-baa5b12a2235-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:13:13 crc kubenswrapper[4892]: I0217 19:13:13.372493 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462a6f91-70eb-412e-9c96-104545e1a695" path="/var/lib/kubelet/pods/462a6f91-70eb-412e-9c96-104545e1a695/volumes" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.020494 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpgjq" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.230503 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d9c8c664b-fqjld"] Feb 17 19:13:14 crc kubenswrapper[4892]: E0217 19:13:14.234460 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7813c451-11c5-4d33-bfca-baa5b12a2235" containerName="keystone-bootstrap" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.234502 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7813c451-11c5-4d33-bfca-baa5b12a2235" containerName="keystone-bootstrap" Feb 17 19:13:14 crc kubenswrapper[4892]: E0217 19:13:14.234529 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462a6f91-70eb-412e-9c96-104545e1a695" containerName="dnsmasq-dns" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.234538 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="462a6f91-70eb-412e-9c96-104545e1a695" containerName="dnsmasq-dns" Feb 17 19:13:14 crc kubenswrapper[4892]: E0217 19:13:14.238706 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462a6f91-70eb-412e-9c96-104545e1a695" containerName="init" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.238743 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="462a6f91-70eb-412e-9c96-104545e1a695" containerName="init" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.239170 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7813c451-11c5-4d33-bfca-baa5b12a2235" containerName="keystone-bootstrap" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.239195 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="462a6f91-70eb-412e-9c96-104545e1a695" containerName="dnsmasq-dns" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.239988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.243663 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.243681 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.243798 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlfb6" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.245663 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.251226 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d9c8c664b-fqjld"] Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.376285 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-fernet-keys\") pod 
\"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.376338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-scripts\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.376475 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-credential-keys\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.376585 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-combined-ca-bundle\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.376699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-config-data\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.376776 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcr7v\" (UniqueName: \"kubernetes.io/projected/92b3e18a-a257-410d-9f92-d6775960d070-kube-api-access-lcr7v\") 
pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.478911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-fernet-keys\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.479009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-scripts\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.479052 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-credential-keys\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.479096 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-combined-ca-bundle\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.479157 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-config-data\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " 
pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.479209 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcr7v\" (UniqueName: \"kubernetes.io/projected/92b3e18a-a257-410d-9f92-d6775960d070-kube-api-access-lcr7v\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.485214 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-scripts\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.485801 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-fernet-keys\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.486694 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-config-data\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.487341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-combined-ca-bundle\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.487449 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92b3e18a-a257-410d-9f92-d6775960d070-credential-keys\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.508904 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcr7v\" (UniqueName: \"kubernetes.io/projected/92b3e18a-a257-410d-9f92-d6775960d070-kube-api-access-lcr7v\") pod \"keystone-7d9c8c664b-fqjld\" (UID: \"92b3e18a-a257-410d-9f92-d6775960d070\") " pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:14 crc kubenswrapper[4892]: I0217 19:13:14.558531 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:15 crc kubenswrapper[4892]: I0217 19:13:15.058041 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d9c8c664b-fqjld"] Feb 17 19:13:15 crc kubenswrapper[4892]: I0217 19:13:15.360398 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:13:16 crc kubenswrapper[4892]: I0217 19:13:16.042389 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d9c8c664b-fqjld" event={"ID":"92b3e18a-a257-410d-9f92-d6775960d070","Type":"ContainerStarted","Data":"2c544d10c7776b8b2e77fc532dec0768d6a951737c1411c8787ee94c1470ff37"} Feb 17 19:13:16 crc kubenswrapper[4892]: I0217 19:13:16.042733 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d9c8c664b-fqjld" event={"ID":"92b3e18a-a257-410d-9f92-d6775960d070","Type":"ContainerStarted","Data":"84c421311056042ec8ccb73f0e3f2c2b665e230eeeb1e5d17f3342555f85d42f"} Feb 17 19:13:16 crc kubenswrapper[4892]: I0217 19:13:16.042764 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:16 crc kubenswrapper[4892]: I0217 19:13:16.044514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"6bc133c88c6796604fb1d95fbac5023f863829395efa1fa885020ed0c34254e3"} Feb 17 19:13:16 crc kubenswrapper[4892]: I0217 19:13:16.072038 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d9c8c664b-fqjld" podStartSLOduration=2.072022709 podStartE2EDuration="2.072022709s" podCreationTimestamp="2026-02-17 19:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:13:16.06240133 +0000 UTC m=+5367.437804595" watchObservedRunningTime="2026-02-17 19:13:16.072022709 +0000 UTC m=+5367.447425974" Feb 17 19:13:46 crc kubenswrapper[4892]: I0217 19:13:46.004052 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d9c8c664b-fqjld" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.456207 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.458221 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.462413 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9pqxx" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.462466 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.462568 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.484607 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.563250 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.563308 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config-secret\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.563415 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbnt\" (UniqueName: \"kubernetes.io/projected/68d93bba-ae69-4c30-8c55-7818f38437b7-kube-api-access-4dbnt\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.664860 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbnt\" (UniqueName: \"kubernetes.io/projected/68d93bba-ae69-4c30-8c55-7818f38437b7-kube-api-access-4dbnt\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.665009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.665047 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config-secret\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.666258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.671184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config-secret\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.683332 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbnt\" (UniqueName: 
\"kubernetes.io/projected/68d93bba-ae69-4c30-8c55-7818f38437b7-kube-api-access-4dbnt\") pod \"openstackclient\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " pod="openstack/openstackclient" Feb 17 19:13:48 crc kubenswrapper[4892]: I0217 19:13:48.799291 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 19:13:49 crc kubenswrapper[4892]: I0217 19:13:49.398387 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 19:13:49 crc kubenswrapper[4892]: I0217 19:13:49.465590 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"68d93bba-ae69-4c30-8c55-7818f38437b7","Type":"ContainerStarted","Data":"0105d09ca807f9ecf103fdf494beca1e967807a46a8e4213546bf5da4e58978e"} Feb 17 19:13:50 crc kubenswrapper[4892]: I0217 19:13:50.483299 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"68d93bba-ae69-4c30-8c55-7818f38437b7","Type":"ContainerStarted","Data":"8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675"} Feb 17 19:13:50 crc kubenswrapper[4892]: I0217 19:13:50.513890 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.513866707 podStartE2EDuration="2.513866707s" podCreationTimestamp="2026-02-17 19:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:13:50.51030453 +0000 UTC m=+5401.885707835" watchObservedRunningTime="2026-02-17 19:13:50.513866707 +0000 UTC m=+5401.889269992" Feb 17 19:14:38 crc kubenswrapper[4892]: E0217 19:14:38.625169 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.41:43240->38.102.83.41:46529: write tcp 38.102.83.41:43240->38.102.83.41:46529: write: connection reset by peer Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 
19:14:48.501630 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7k9bt"] Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.504924 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.529304 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7k9bt"] Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.638321 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-utilities\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.638511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5mcz\" (UniqueName: \"kubernetes.io/projected/92a4d608-6a14-463c-add8-5150b532960e-kube-api-access-h5mcz\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.638738 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-catalog-content\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.741073 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5mcz\" (UniqueName: 
\"kubernetes.io/projected/92a4d608-6a14-463c-add8-5150b532960e-kube-api-access-h5mcz\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.741214 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-catalog-content\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.741257 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-utilities\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.741863 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-catalog-content\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.741930 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-utilities\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.761567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5mcz\" (UniqueName: 
\"kubernetes.io/projected/92a4d608-6a14-463c-add8-5150b532960e-kube-api-access-h5mcz\") pod \"certified-operators-7k9bt\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:48 crc kubenswrapper[4892]: I0217 19:14:48.833953 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:49 crc kubenswrapper[4892]: I0217 19:14:49.386356 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7k9bt"] Feb 17 19:14:50 crc kubenswrapper[4892]: I0217 19:14:50.293412 4892 generic.go:334] "Generic (PLEG): container finished" podID="92a4d608-6a14-463c-add8-5150b532960e" containerID="f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6" exitCode=0 Feb 17 19:14:50 crc kubenswrapper[4892]: I0217 19:14:50.293506 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k9bt" event={"ID":"92a4d608-6a14-463c-add8-5150b532960e","Type":"ContainerDied","Data":"f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6"} Feb 17 19:14:50 crc kubenswrapper[4892]: I0217 19:14:50.293868 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k9bt" event={"ID":"92a4d608-6a14-463c-add8-5150b532960e","Type":"ContainerStarted","Data":"e2d7cab86ad57eb42faddd8e9005a8e384dde0545fa02632a4d92334a3295e33"} Feb 17 19:14:50 crc kubenswrapper[4892]: I0217 19:14:50.297280 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 19:14:52 crc kubenswrapper[4892]: I0217 19:14:52.318526 4892 generic.go:334] "Generic (PLEG): container finished" podID="92a4d608-6a14-463c-add8-5150b532960e" containerID="e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f" exitCode=0 Feb 17 19:14:52 crc kubenswrapper[4892]: I0217 19:14:52.318651 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k9bt" event={"ID":"92a4d608-6a14-463c-add8-5150b532960e","Type":"ContainerDied","Data":"e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f"} Feb 17 19:14:53 crc kubenswrapper[4892]: I0217 19:14:53.330431 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k9bt" event={"ID":"92a4d608-6a14-463c-add8-5150b532960e","Type":"ContainerStarted","Data":"72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035"} Feb 17 19:14:53 crc kubenswrapper[4892]: I0217 19:14:53.354586 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7k9bt" podStartSLOduration=2.884906771 podStartE2EDuration="5.35456331s" podCreationTimestamp="2026-02-17 19:14:48 +0000 UTC" firstStartedPulling="2026-02-17 19:14:50.296889769 +0000 UTC m=+5461.672293074" lastFinishedPulling="2026-02-17 19:14:52.766546328 +0000 UTC m=+5464.141949613" observedRunningTime="2026-02-17 19:14:53.349852862 +0000 UTC m=+5464.725256147" watchObservedRunningTime="2026-02-17 19:14:53.35456331 +0000 UTC m=+5464.729966595" Feb 17 19:14:58 crc kubenswrapper[4892]: I0217 19:14:58.834626 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:58 crc kubenswrapper[4892]: I0217 19:14:58.835152 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:58 crc kubenswrapper[4892]: I0217 19:14:58.889426 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:59 crc kubenswrapper[4892]: I0217 19:14:59.443151 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:14:59 crc kubenswrapper[4892]: 
I0217 19:14:59.496547 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7k9bt"] Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.068392 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2j98k"] Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.075499 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2j98k"] Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.146368 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5"] Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.147703 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.153129 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.153177 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.158995 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5"] Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.321986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53744014-b001-443c-980c-2ee0a13a037c-config-volume\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.322059 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9fw6\" (UniqueName: \"kubernetes.io/projected/53744014-b001-443c-980c-2ee0a13a037c-kube-api-access-x9fw6\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.322452 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53744014-b001-443c-980c-2ee0a13a037c-secret-volume\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.423892 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53744014-b001-443c-980c-2ee0a13a037c-config-volume\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.423945 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9fw6\" (UniqueName: \"kubernetes.io/projected/53744014-b001-443c-980c-2ee0a13a037c-kube-api-access-x9fw6\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.424124 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53744014-b001-443c-980c-2ee0a13a037c-secret-volume\") pod \"collect-profiles-29522595-9rrr5\" (UID: 
\"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.425094 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53744014-b001-443c-980c-2ee0a13a037c-config-volume\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.439512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53744014-b001-443c-980c-2ee0a13a037c-secret-volume\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.439763 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9fw6\" (UniqueName: \"kubernetes.io/projected/53744014-b001-443c-980c-2ee0a13a037c-kube-api-access-x9fw6\") pod \"collect-profiles-29522595-9rrr5\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:00 crc kubenswrapper[4892]: I0217 19:15:00.536089 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.003109 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5"] Feb 17 19:15:01 crc kubenswrapper[4892]: W0217 19:15:01.015676 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53744014_b001_443c_980c_2ee0a13a037c.slice/crio-78d8a592d0894dc569176fdfae9efdaa5ee11418b83b32c5f5830b365652592e WatchSource:0}: Error finding container 78d8a592d0894dc569176fdfae9efdaa5ee11418b83b32c5f5830b365652592e: Status 404 returned error can't find the container with id 78d8a592d0894dc569176fdfae9efdaa5ee11418b83b32c5f5830b365652592e Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.382770 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69504bab-0d0d-49dd-9e90-42682a7906ae" path="/var/lib/kubelet/pods/69504bab-0d0d-49dd-9e90-42682a7906ae/volumes" Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.452479 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7k9bt" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="registry-server" containerID="cri-o://72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035" gracePeriod=2 Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.452958 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" event={"ID":"53744014-b001-443c-980c-2ee0a13a037c","Type":"ContainerStarted","Data":"08cb4266fc063cfdd35144c7fc72be99ccf72818535563f1455af8a14738d6ed"} Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.453124 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" 
event={"ID":"53744014-b001-443c-980c-2ee0a13a037c","Type":"ContainerStarted","Data":"78d8a592d0894dc569176fdfae9efdaa5ee11418b83b32c5f5830b365652592e"} Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.478161 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" podStartSLOduration=1.478143864 podStartE2EDuration="1.478143864s" podCreationTimestamp="2026-02-17 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:15:01.469289114 +0000 UTC m=+5472.844692379" watchObservedRunningTime="2026-02-17 19:15:01.478143864 +0000 UTC m=+5472.853547129" Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.513258 4892 scope.go:117] "RemoveContainer" containerID="a0927066ab9bf2b90db1ce0228bcb3dfed54eb95b44fcdf34522a6f3c58fbf28" Feb 17 19:15:01 crc kubenswrapper[4892]: I0217 19:15:01.936682 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.078511 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5mcz\" (UniqueName: \"kubernetes.io/projected/92a4d608-6a14-463c-add8-5150b532960e-kube-api-access-h5mcz\") pod \"92a4d608-6a14-463c-add8-5150b532960e\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.078636 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-utilities\") pod \"92a4d608-6a14-463c-add8-5150b532960e\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.078733 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-catalog-content\") pod \"92a4d608-6a14-463c-add8-5150b532960e\" (UID: \"92a4d608-6a14-463c-add8-5150b532960e\") " Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.081246 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-utilities" (OuterVolumeSpecName: "utilities") pod "92a4d608-6a14-463c-add8-5150b532960e" (UID: "92a4d608-6a14-463c-add8-5150b532960e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.087937 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a4d608-6a14-463c-add8-5150b532960e-kube-api-access-h5mcz" (OuterVolumeSpecName: "kube-api-access-h5mcz") pod "92a4d608-6a14-463c-add8-5150b532960e" (UID: "92a4d608-6a14-463c-add8-5150b532960e"). InnerVolumeSpecName "kube-api-access-h5mcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.180410 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5mcz\" (UniqueName: \"kubernetes.io/projected/92a4d608-6a14-463c-add8-5150b532960e-kube-api-access-h5mcz\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.180916 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.469613 4892 generic.go:334] "Generic (PLEG): container finished" podID="92a4d608-6a14-463c-add8-5150b532960e" containerID="72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035" exitCode=0 Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.469743 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k9bt" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.470554 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k9bt" event={"ID":"92a4d608-6a14-463c-add8-5150b532960e","Type":"ContainerDied","Data":"72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035"} Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.470596 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k9bt" event={"ID":"92a4d608-6a14-463c-add8-5150b532960e","Type":"ContainerDied","Data":"e2d7cab86ad57eb42faddd8e9005a8e384dde0545fa02632a4d92334a3295e33"} Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.470618 4892 scope.go:117] "RemoveContainer" containerID="72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.475631 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="53744014-b001-443c-980c-2ee0a13a037c" containerID="08cb4266fc063cfdd35144c7fc72be99ccf72818535563f1455af8a14738d6ed" exitCode=0 Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.475705 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" event={"ID":"53744014-b001-443c-980c-2ee0a13a037c","Type":"ContainerDied","Data":"08cb4266fc063cfdd35144c7fc72be99ccf72818535563f1455af8a14738d6ed"} Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.522346 4892 scope.go:117] "RemoveContainer" containerID="e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.534000 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92a4d608-6a14-463c-add8-5150b532960e" (UID: "92a4d608-6a14-463c-add8-5150b532960e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.558726 4892 scope.go:117] "RemoveContainer" containerID="f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.590413 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a4d608-6a14-463c-add8-5150b532960e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.599199 4892 scope.go:117] "RemoveContainer" containerID="72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035" Feb 17 19:15:02 crc kubenswrapper[4892]: E0217 19:15:02.599931 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035\": container with ID starting with 72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035 not found: ID does not exist" containerID="72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.599986 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035"} err="failed to get container status \"72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035\": rpc error: code = NotFound desc = could not find container \"72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035\": container with ID starting with 72bce17cdb95ae330d875768850c1b2d4089ca982bfc9f94ebaa0eebdb621035 not found: ID does not exist" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.600018 4892 scope.go:117] "RemoveContainer" containerID="e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f" Feb 17 19:15:02 crc kubenswrapper[4892]: E0217 19:15:02.600463 4892 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f\": container with ID starting with e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f not found: ID does not exist" containerID="e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.600506 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f"} err="failed to get container status \"e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f\": rpc error: code = NotFound desc = could not find container \"e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f\": container with ID starting with e8672e10dc17fdc85e50d1dfc631470c2b8eaee452fc1ca56afc206b7dc7692f not found: ID does not exist" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.600538 4892 scope.go:117] "RemoveContainer" containerID="f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6" Feb 17 19:15:02 crc kubenswrapper[4892]: E0217 19:15:02.601128 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6\": container with ID starting with f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6 not found: ID does not exist" containerID="f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.601246 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6"} err="failed to get container status \"f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6\": rpc error: code = NotFound desc = could 
not find container \"f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6\": container with ID starting with f7f5fe7dfde0653888e800853b2484af8b06373aa6a6963adc58d9a2514472b6 not found: ID does not exist" Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.831328 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7k9bt"] Feb 17 19:15:02 crc kubenswrapper[4892]: I0217 19:15:02.839074 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7k9bt"] Feb 17 19:15:03 crc kubenswrapper[4892]: I0217 19:15:03.390866 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a4d608-6a14-463c-add8-5150b532960e" path="/var/lib/kubelet/pods/92a4d608-6a14-463c-add8-5150b532960e/volumes" Feb 17 19:15:03 crc kubenswrapper[4892]: I0217 19:15:03.847373 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.022211 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9fw6\" (UniqueName: \"kubernetes.io/projected/53744014-b001-443c-980c-2ee0a13a037c-kube-api-access-x9fw6\") pod \"53744014-b001-443c-980c-2ee0a13a037c\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.022352 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53744014-b001-443c-980c-2ee0a13a037c-secret-volume\") pod \"53744014-b001-443c-980c-2ee0a13a037c\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.022378 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/53744014-b001-443c-980c-2ee0a13a037c-config-volume\") pod \"53744014-b001-443c-980c-2ee0a13a037c\" (UID: \"53744014-b001-443c-980c-2ee0a13a037c\") " Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.022897 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53744014-b001-443c-980c-2ee0a13a037c-config-volume" (OuterVolumeSpecName: "config-volume") pod "53744014-b001-443c-980c-2ee0a13a037c" (UID: "53744014-b001-443c-980c-2ee0a13a037c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.031991 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53744014-b001-443c-980c-2ee0a13a037c-kube-api-access-x9fw6" (OuterVolumeSpecName: "kube-api-access-x9fw6") pod "53744014-b001-443c-980c-2ee0a13a037c" (UID: "53744014-b001-443c-980c-2ee0a13a037c"). InnerVolumeSpecName "kube-api-access-x9fw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.032879 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53744014-b001-443c-980c-2ee0a13a037c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "53744014-b001-443c-980c-2ee0a13a037c" (UID: "53744014-b001-443c-980c-2ee0a13a037c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.123924 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53744014-b001-443c-980c-2ee0a13a037c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.123953 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53744014-b001-443c-980c-2ee0a13a037c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.123964 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9fw6\" (UniqueName: \"kubernetes.io/projected/53744014-b001-443c-980c-2ee0a13a037c-kube-api-access-x9fw6\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.494198 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" event={"ID":"53744014-b001-443c-980c-2ee0a13a037c","Type":"ContainerDied","Data":"78d8a592d0894dc569176fdfae9efdaa5ee11418b83b32c5f5830b365652592e"} Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.494232 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d8a592d0894dc569176fdfae9efdaa5ee11418b83b32c5f5830b365652592e" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.494251 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5" Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.561542 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t"] Feb 17 19:15:04 crc kubenswrapper[4892]: I0217 19:15:04.569215 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522550-txp2t"] Feb 17 19:15:05 crc kubenswrapper[4892]: I0217 19:15:05.373533 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95eb9988-23c6-4a17-a5da-6b6c70984deb" path="/var/lib/kubelet/pods/95eb9988-23c6-4a17-a5da-6b6c70984deb/volumes" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.487735 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-w5ds9"] Feb 17 19:15:33 crc kubenswrapper[4892]: E0217 19:15:33.489092 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="registry-server" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.489111 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="registry-server" Feb 17 19:15:33 crc kubenswrapper[4892]: E0217 19:15:33.489142 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53744014-b001-443c-980c-2ee0a13a037c" containerName="collect-profiles" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.489149 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="53744014-b001-443c-980c-2ee0a13a037c" containerName="collect-profiles" Feb 17 19:15:33 crc kubenswrapper[4892]: E0217 19:15:33.489164 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="extract-utilities" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.489171 4892 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="extract-utilities" Feb 17 19:15:33 crc kubenswrapper[4892]: E0217 19:15:33.489183 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="extract-content" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.489191 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="extract-content" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.489437 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="53744014-b001-443c-980c-2ee0a13a037c" containerName="collect-profiles" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.489452 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a4d608-6a14-463c-add8-5150b532960e" containerName="registry-server" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.490282 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.506085 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2660-account-create-update-kjtbg"] Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.507649 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.510336 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.516525 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w5ds9"] Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.530480 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2660-account-create-update-kjtbg"] Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.676294 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68c4x\" (UniqueName: \"kubernetes.io/projected/762ac045-f6cb-4819-81ff-89553338a250-kube-api-access-68c4x\") pod \"barbican-2660-account-create-update-kjtbg\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.676456 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba60e1d6-317d-4d32-b534-26e5490eb1fc-operator-scripts\") pod \"barbican-db-create-w5ds9\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.676703 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762ac045-f6cb-4819-81ff-89553338a250-operator-scripts\") pod \"barbican-2660-account-create-update-kjtbg\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.676804 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bx5t\" (UniqueName: \"kubernetes.io/projected/ba60e1d6-317d-4d32-b534-26e5490eb1fc-kube-api-access-4bx5t\") pod \"barbican-db-create-w5ds9\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.779083 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762ac045-f6cb-4819-81ff-89553338a250-operator-scripts\") pod \"barbican-2660-account-create-update-kjtbg\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.779178 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bx5t\" (UniqueName: \"kubernetes.io/projected/ba60e1d6-317d-4d32-b534-26e5490eb1fc-kube-api-access-4bx5t\") pod \"barbican-db-create-w5ds9\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.779334 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68c4x\" (UniqueName: \"kubernetes.io/projected/762ac045-f6cb-4819-81ff-89553338a250-kube-api-access-68c4x\") pod \"barbican-2660-account-create-update-kjtbg\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.779400 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba60e1d6-317d-4d32-b534-26e5490eb1fc-operator-scripts\") pod \"barbican-db-create-w5ds9\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 
19:15:33.779770 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762ac045-f6cb-4819-81ff-89553338a250-operator-scripts\") pod \"barbican-2660-account-create-update-kjtbg\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.780571 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba60e1d6-317d-4d32-b534-26e5490eb1fc-operator-scripts\") pod \"barbican-db-create-w5ds9\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.802071 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68c4x\" (UniqueName: \"kubernetes.io/projected/762ac045-f6cb-4819-81ff-89553338a250-kube-api-access-68c4x\") pod \"barbican-2660-account-create-update-kjtbg\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.803699 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bx5t\" (UniqueName: \"kubernetes.io/projected/ba60e1d6-317d-4d32-b534-26e5490eb1fc-kube-api-access-4bx5t\") pod \"barbican-db-create-w5ds9\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.819097 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:33 crc kubenswrapper[4892]: I0217 19:15:33.842003 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.077018 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w5ds9"] Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.374658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2660-account-create-update-kjtbg"] Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.864974 4892 generic.go:334] "Generic (PLEG): container finished" podID="ba60e1d6-317d-4d32-b534-26e5490eb1fc" containerID="db4cf0b38f43b1cff85a050ebe406706b8addde71dc13a4b348bc844af64d635" exitCode=0 Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.865071 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w5ds9" event={"ID":"ba60e1d6-317d-4d32-b534-26e5490eb1fc","Type":"ContainerDied","Data":"db4cf0b38f43b1cff85a050ebe406706b8addde71dc13a4b348bc844af64d635"} Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.865313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w5ds9" event={"ID":"ba60e1d6-317d-4d32-b534-26e5490eb1fc","Type":"ContainerStarted","Data":"bf837621650b7d524e11f810cb9fa4dd561c017d3ecf65f9c52bc42eb2674f54"} Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.869416 4892 generic.go:334] "Generic (PLEG): container finished" podID="762ac045-f6cb-4819-81ff-89553338a250" containerID="7d4ae2b60b9faa6f0a3a71aa98aa4ab3e6d59e97011d769997322e7e43e6e5a9" exitCode=0 Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.869464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2660-account-create-update-kjtbg" event={"ID":"762ac045-f6cb-4819-81ff-89553338a250","Type":"ContainerDied","Data":"7d4ae2b60b9faa6f0a3a71aa98aa4ab3e6d59e97011d769997322e7e43e6e5a9"} Feb 17 19:15:34 crc kubenswrapper[4892]: I0217 19:15:34.869494 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-2660-account-create-update-kjtbg" event={"ID":"762ac045-f6cb-4819-81ff-89553338a250","Type":"ContainerStarted","Data":"e4753f8cae67ab59153ae4a5d797b1e3e5920123e87997c57c39d773dc199338"} Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.284778 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.292797 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.366217 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bx5t\" (UniqueName: \"kubernetes.io/projected/ba60e1d6-317d-4d32-b534-26e5490eb1fc-kube-api-access-4bx5t\") pod \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.366386 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68c4x\" (UniqueName: \"kubernetes.io/projected/762ac045-f6cb-4819-81ff-89553338a250-kube-api-access-68c4x\") pod \"762ac045-f6cb-4819-81ff-89553338a250\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.366413 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762ac045-f6cb-4819-81ff-89553338a250-operator-scripts\") pod \"762ac045-f6cb-4819-81ff-89553338a250\" (UID: \"762ac045-f6cb-4819-81ff-89553338a250\") " Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.366471 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba60e1d6-317d-4d32-b534-26e5490eb1fc-operator-scripts\") pod 
\"ba60e1d6-317d-4d32-b534-26e5490eb1fc\" (UID: \"ba60e1d6-317d-4d32-b534-26e5490eb1fc\") " Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.367173 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba60e1d6-317d-4d32-b534-26e5490eb1fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba60e1d6-317d-4d32-b534-26e5490eb1fc" (UID: "ba60e1d6-317d-4d32-b534-26e5490eb1fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.367190 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/762ac045-f6cb-4819-81ff-89553338a250-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "762ac045-f6cb-4819-81ff-89553338a250" (UID: "762ac045-f6cb-4819-81ff-89553338a250"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.367511 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762ac045-f6cb-4819-81ff-89553338a250-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.367532 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba60e1d6-317d-4d32-b534-26e5490eb1fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.372484 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762ac045-f6cb-4819-81ff-89553338a250-kube-api-access-68c4x" (OuterVolumeSpecName: "kube-api-access-68c4x") pod "762ac045-f6cb-4819-81ff-89553338a250" (UID: "762ac045-f6cb-4819-81ff-89553338a250"). InnerVolumeSpecName "kube-api-access-68c4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.373482 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba60e1d6-317d-4d32-b534-26e5490eb1fc-kube-api-access-4bx5t" (OuterVolumeSpecName: "kube-api-access-4bx5t") pod "ba60e1d6-317d-4d32-b534-26e5490eb1fc" (UID: "ba60e1d6-317d-4d32-b534-26e5490eb1fc"). InnerVolumeSpecName "kube-api-access-4bx5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.469107 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68c4x\" (UniqueName: \"kubernetes.io/projected/762ac045-f6cb-4819-81ff-89553338a250-kube-api-access-68c4x\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.469140 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bx5t\" (UniqueName: \"kubernetes.io/projected/ba60e1d6-317d-4d32-b534-26e5490eb1fc-kube-api-access-4bx5t\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.893886 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w5ds9" event={"ID":"ba60e1d6-317d-4d32-b534-26e5490eb1fc","Type":"ContainerDied","Data":"bf837621650b7d524e11f810cb9fa4dd561c017d3ecf65f9c52bc42eb2674f54"} Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.893932 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf837621650b7d524e11f810cb9fa4dd561c017d3ecf65f9c52bc42eb2674f54" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.893899 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w5ds9" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.896627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2660-account-create-update-kjtbg" event={"ID":"762ac045-f6cb-4819-81ff-89553338a250","Type":"ContainerDied","Data":"e4753f8cae67ab59153ae4a5d797b1e3e5920123e87997c57c39d773dc199338"} Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.896676 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4753f8cae67ab59153ae4a5d797b1e3e5920123e87997c57c39d773dc199338" Feb 17 19:15:36 crc kubenswrapper[4892]: I0217 19:15:36.896739 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2660-account-create-update-kjtbg" Feb 17 19:15:37 crc kubenswrapper[4892]: I0217 19:15:37.424507 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:15:37 crc kubenswrapper[4892]: I0217 19:15:37.424934 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.879380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5h2fz"] Feb 17 19:15:38 crc kubenswrapper[4892]: E0217 19:15:38.879746 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba60e1d6-317d-4d32-b534-26e5490eb1fc" containerName="mariadb-database-create" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.879758 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ba60e1d6-317d-4d32-b534-26e5490eb1fc" containerName="mariadb-database-create" Feb 17 19:15:38 crc kubenswrapper[4892]: E0217 19:15:38.879771 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762ac045-f6cb-4819-81ff-89553338a250" containerName="mariadb-account-create-update" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.879777 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="762ac045-f6cb-4819-81ff-89553338a250" containerName="mariadb-account-create-update" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.879988 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba60e1d6-317d-4d32-b534-26e5490eb1fc" containerName="mariadb-database-create" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.880004 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="762ac045-f6cb-4819-81ff-89553338a250" containerName="mariadb-account-create-update" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.880558 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.882870 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9p2s8" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.891894 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.894172 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5h2fz"] Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.918478 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-combined-ca-bundle\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.918542 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-db-sync-config-data\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:38 crc kubenswrapper[4892]: I0217 19:15:38.918614 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sxcp\" (UniqueName: \"kubernetes.io/projected/7b188c46-158a-41cb-9f68-945408ac3ed5-kube-api-access-5sxcp\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.020172 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-combined-ca-bundle\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.020246 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-db-sync-config-data\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.020339 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sxcp\" (UniqueName: \"kubernetes.io/projected/7b188c46-158a-41cb-9f68-945408ac3ed5-kube-api-access-5sxcp\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.026041 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-combined-ca-bundle\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.026583 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-db-sync-config-data\") pod \"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.039577 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sxcp\" (UniqueName: \"kubernetes.io/projected/7b188c46-158a-41cb-9f68-945408ac3ed5-kube-api-access-5sxcp\") pod 
\"barbican-db-sync-5h2fz\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.209981 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:39 crc kubenswrapper[4892]: W0217 19:15:39.712038 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b188c46_158a_41cb_9f68_945408ac3ed5.slice/crio-b9422431d539c6f11e0232b097c6e7db845f0e7deab9f24e0b91afc1dbd192ce WatchSource:0}: Error finding container b9422431d539c6f11e0232b097c6e7db845f0e7deab9f24e0b91afc1dbd192ce: Status 404 returned error can't find the container with id b9422431d539c6f11e0232b097c6e7db845f0e7deab9f24e0b91afc1dbd192ce Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.715967 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5h2fz"] Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.930713 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5h2fz" event={"ID":"7b188c46-158a-41cb-9f68-945408ac3ed5","Type":"ContainerStarted","Data":"6ba4a6e09056831cb826c5ef28434496f8537dd932678e77ebc52a1ccea229f9"} Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.931114 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5h2fz" event={"ID":"7b188c46-158a-41cb-9f68-945408ac3ed5","Type":"ContainerStarted","Data":"b9422431d539c6f11e0232b097c6e7db845f0e7deab9f24e0b91afc1dbd192ce"} Feb 17 19:15:39 crc kubenswrapper[4892]: I0217 19:15:39.950039 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5h2fz" podStartSLOduration=1.9500199999999999 podStartE2EDuration="1.95002s" podCreationTimestamp="2026-02-17 19:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:15:39.945398015 +0000 UTC m=+5511.320801280" watchObservedRunningTime="2026-02-17 19:15:39.95002 +0000 UTC m=+5511.325423286" Feb 17 19:15:41 crc kubenswrapper[4892]: E0217 19:15:41.074841 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b188c46_158a_41cb_9f68_945408ac3ed5.slice/crio-conmon-6ba4a6e09056831cb826c5ef28434496f8537dd932678e77ebc52a1ccea229f9.scope\": RecentStats: unable to find data in memory cache]" Feb 17 19:15:41 crc kubenswrapper[4892]: I0217 19:15:41.950706 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b188c46-158a-41cb-9f68-945408ac3ed5" containerID="6ba4a6e09056831cb826c5ef28434496f8537dd932678e77ebc52a1ccea229f9" exitCode=0 Feb 17 19:15:41 crc kubenswrapper[4892]: I0217 19:15:41.950891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5h2fz" event={"ID":"7b188c46-158a-41cb-9f68-945408ac3ed5","Type":"ContainerDied","Data":"6ba4a6e09056831cb826c5ef28434496f8537dd932678e77ebc52a1ccea229f9"} Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.315807 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.406111 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-combined-ca-bundle\") pod \"7b188c46-158a-41cb-9f68-945408ac3ed5\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.406811 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sxcp\" (UniqueName: \"kubernetes.io/projected/7b188c46-158a-41cb-9f68-945408ac3ed5-kube-api-access-5sxcp\") pod \"7b188c46-158a-41cb-9f68-945408ac3ed5\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.407027 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-db-sync-config-data\") pod \"7b188c46-158a-41cb-9f68-945408ac3ed5\" (UID: \"7b188c46-158a-41cb-9f68-945408ac3ed5\") " Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.415077 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7b188c46-158a-41cb-9f68-945408ac3ed5" (UID: "7b188c46-158a-41cb-9f68-945408ac3ed5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.416149 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b188c46-158a-41cb-9f68-945408ac3ed5-kube-api-access-5sxcp" (OuterVolumeSpecName: "kube-api-access-5sxcp") pod "7b188c46-158a-41cb-9f68-945408ac3ed5" (UID: "7b188c46-158a-41cb-9f68-945408ac3ed5"). 
InnerVolumeSpecName "kube-api-access-5sxcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.449647 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b188c46-158a-41cb-9f68-945408ac3ed5" (UID: "7b188c46-158a-41cb-9f68-945408ac3ed5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.509916 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.509943 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b188c46-158a-41cb-9f68-945408ac3ed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.509952 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sxcp\" (UniqueName: \"kubernetes.io/projected/7b188c46-158a-41cb-9f68-945408ac3ed5-kube-api-access-5sxcp\") on node \"crc\" DevicePath \"\"" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.983619 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5h2fz" event={"ID":"7b188c46-158a-41cb-9f68-945408ac3ed5","Type":"ContainerDied","Data":"b9422431d539c6f11e0232b097c6e7db845f0e7deab9f24e0b91afc1dbd192ce"} Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.983685 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5h2fz" Feb 17 19:15:43 crc kubenswrapper[4892]: I0217 19:15:43.983692 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9422431d539c6f11e0232b097c6e7db845f0e7deab9f24e0b91afc1dbd192ce" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.238491 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b97c57775-zc698"] Feb 17 19:15:44 crc kubenswrapper[4892]: E0217 19:15:44.269315 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b188c46-158a-41cb-9f68-945408ac3ed5" containerName="barbican-db-sync" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.269351 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b188c46-158a-41cb-9f68-945408ac3ed5" containerName="barbican-db-sync" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.270749 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b188c46-158a-41cb-9f68-945408ac3ed5" containerName="barbican-db-sync" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.288977 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b97c57775-zc698"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.289098 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.296088 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9p2s8" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.296436 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.299188 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.299395 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-68d8fcd9d6-kjplh"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.324969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-combined-ca-bundle\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.325010 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-logs\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.325098 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxjkr\" (UniqueName: \"kubernetes.io/projected/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-kube-api-access-jxjkr\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " 
pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.325132 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-config-data-custom\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.325164 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-config-data\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.325310 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.329530 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.330526 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68d8fcd9d6-kjplh"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.373385 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd66dcf7-j2q47"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.375049 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.412955 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd66dcf7-j2q47"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.426876 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54tc9\" (UniqueName: \"kubernetes.io/projected/de7a0634-e601-4c65-adb6-8e2625e1709b-kube-api-access-54tc9\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.426966 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-dns-svc\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427016 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-combined-ca-bundle\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427062 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxjkr\" (UniqueName: \"kubernetes.io/projected/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-kube-api-access-jxjkr\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427089 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/5e284394-c7d9-41d3-983e-b1474ec1d8c3-kube-api-access-j75ph\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-config-data-custom\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427160 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de7a0634-e601-4c65-adb6-8e2625e1709b-logs\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427190 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-config\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427230 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-config-data\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: 
I0217 19:15:44.427297 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-config-data\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427321 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427345 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427393 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-combined-ca-bundle\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427420 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-logs\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 
crc kubenswrapper[4892]: I0217 19:15:44.427450 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-config-data-custom\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.427999 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-logs\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.433883 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-config-data-custom\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.436909 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-combined-ca-bundle\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.444198 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-694cc5859d-jlpkr"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.446389 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.446393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-config-data\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.451575 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.454445 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-694cc5859d-jlpkr"] Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.454597 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxjkr\" (UniqueName: \"kubernetes.io/projected/fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15-kube-api-access-jxjkr\") pod \"barbican-worker-7b97c57775-zc698\" (UID: \"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15\") " pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529722 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-config-data\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529756 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 
19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529773 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529796 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8380e84c-8f80-43dc-825e-d9dd3dd0533f-logs\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529916 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-combined-ca-bundle\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529938 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-config-data-custom\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529972 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvwbv\" (UniqueName: \"kubernetes.io/projected/8380e84c-8f80-43dc-825e-d9dd3dd0533f-kube-api-access-tvwbv\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " 
pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.529993 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54tc9\" (UniqueName: \"kubernetes.io/projected/de7a0634-e601-4c65-adb6-8e2625e1709b-kube-api-access-54tc9\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530020 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-dns-svc\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-config-data\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-config-data-custom\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530086 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-combined-ca-bundle\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: 
\"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/5e284394-c7d9-41d3-983e-b1474ec1d8c3-kube-api-access-j75ph\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530149 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de7a0634-e601-4c65-adb6-8e2625e1709b-logs\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530172 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-config\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.530666 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.531025 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de7a0634-e601-4c65-adb6-8e2625e1709b-logs\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " 
pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.531216 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-config\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.534399 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-dns-svc\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.535977 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-combined-ca-bundle\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.536643 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-config-data-custom\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.536667 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " 
pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.536988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7a0634-e601-4c65-adb6-8e2625e1709b-config-data\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.547479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/5e284394-c7d9-41d3-983e-b1474ec1d8c3-kube-api-access-j75ph\") pod \"dnsmasq-dns-79bd66dcf7-j2q47\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.548045 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54tc9\" (UniqueName: \"kubernetes.io/projected/de7a0634-e601-4c65-adb6-8e2625e1709b-kube-api-access-54tc9\") pod \"barbican-keystone-listener-68d8fcd9d6-kjplh\" (UID: \"de7a0634-e601-4c65-adb6-8e2625e1709b\") " pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.632134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-config-data\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.632177 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-config-data-custom\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " 
pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.632286 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8380e84c-8f80-43dc-825e-d9dd3dd0533f-logs\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.632314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-combined-ca-bundle\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.632349 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvwbv\" (UniqueName: \"kubernetes.io/projected/8380e84c-8f80-43dc-825e-d9dd3dd0533f-kube-api-access-tvwbv\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.633133 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8380e84c-8f80-43dc-825e-d9dd3dd0533f-logs\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.639269 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-combined-ca-bundle\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc 
kubenswrapper[4892]: I0217 19:15:44.639838 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-config-data-custom\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.641141 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b97c57775-zc698" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.641343 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8380e84c-8f80-43dc-825e-d9dd3dd0533f-config-data\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.655465 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.658074 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvwbv\" (UniqueName: \"kubernetes.io/projected/8380e84c-8f80-43dc-825e-d9dd3dd0533f-kube-api-access-tvwbv\") pod \"barbican-api-694cc5859d-jlpkr\" (UID: \"8380e84c-8f80-43dc-825e-d9dd3dd0533f\") " pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.703579 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:15:44 crc kubenswrapper[4892]: I0217 19:15:44.824562 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:45 crc kubenswrapper[4892]: I0217 19:15:45.140943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68d8fcd9d6-kjplh"] Feb 17 19:15:45 crc kubenswrapper[4892]: I0217 19:15:45.224238 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b97c57775-zc698"] Feb 17 19:15:45 crc kubenswrapper[4892]: W0217 19:15:45.232330 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa15978b_f3ec_4d7f_8a4e_4a4bcfd61f15.slice/crio-750b903b9b1a7093947f85e274a2e3b484093954ffdb42be3be7ecf1a1fb64b2 WatchSource:0}: Error finding container 750b903b9b1a7093947f85e274a2e3b484093954ffdb42be3be7ecf1a1fb64b2: Status 404 returned error can't find the container with id 750b903b9b1a7093947f85e274a2e3b484093954ffdb42be3be7ecf1a1fb64b2 Feb 17 19:15:45 crc kubenswrapper[4892]: I0217 19:15:45.311069 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd66dcf7-j2q47"] Feb 17 19:15:45 crc kubenswrapper[4892]: I0217 19:15:45.485020 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-694cc5859d-jlpkr"] Feb 17 19:15:45 crc kubenswrapper[4892]: W0217 19:15:45.492227 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8380e84c_8f80_43dc_825e_d9dd3dd0533f.slice/crio-61de8b4531d5272cb5cd2362d26f6301d03c706130eeb31e73227a21b2be5c2b WatchSource:0}: Error finding container 61de8b4531d5272cb5cd2362d26f6301d03c706130eeb31e73227a21b2be5c2b: Status 404 returned error can't find the container with id 61de8b4531d5272cb5cd2362d26f6301d03c706130eeb31e73227a21b2be5c2b Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.019943 4892 generic.go:334] "Generic (PLEG): container finished" podID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" 
containerID="a8c8db4bc61ca522bd50695a94b01ccf18d9da6b3d5fb82d47e2508b4651986a" exitCode=0 Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.020018 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" event={"ID":"5e284394-c7d9-41d3-983e-b1474ec1d8c3","Type":"ContainerDied","Data":"a8c8db4bc61ca522bd50695a94b01ccf18d9da6b3d5fb82d47e2508b4651986a"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.020242 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" event={"ID":"5e284394-c7d9-41d3-983e-b1474ec1d8c3","Type":"ContainerStarted","Data":"8680a7b5023f11fb6d428d73e2021eaffbe60fe08503c23077610f88f1b9c75e"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.022530 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-694cc5859d-jlpkr" event={"ID":"8380e84c-8f80-43dc-825e-d9dd3dd0533f","Type":"ContainerStarted","Data":"84470977c989cee22862315f0b44f2b2174ba28e329094a624fa1ab67d296267"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.022583 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-694cc5859d-jlpkr" event={"ID":"8380e84c-8f80-43dc-825e-d9dd3dd0533f","Type":"ContainerStarted","Data":"ca5399189ff9ffb349535ced73b1308131f0506e25b2a3166cb1d6fd7a05068b"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.022595 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-694cc5859d-jlpkr" event={"ID":"8380e84c-8f80-43dc-825e-d9dd3dd0533f","Type":"ContainerStarted","Data":"61de8b4531d5272cb5cd2362d26f6301d03c706130eeb31e73227a21b2be5c2b"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.024220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.024240 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-694cc5859d-jlpkr" Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.025434 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b97c57775-zc698" event={"ID":"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15","Type":"ContainerStarted","Data":"1d2b7daf3bc4cc8e7aab4782909a98b80e28aad83139f5a4a6072c84ac1c5a99"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.025593 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b97c57775-zc698" event={"ID":"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15","Type":"ContainerStarted","Data":"a2f0a0d2739b17b03ac2edd9b5ad35ebcaeee1f8d94417a7bf97f8b3811be735"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.025697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b97c57775-zc698" event={"ID":"fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15","Type":"ContainerStarted","Data":"750b903b9b1a7093947f85e274a2e3b484093954ffdb42be3be7ecf1a1fb64b2"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.028762 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" event={"ID":"de7a0634-e601-4c65-adb6-8e2625e1709b","Type":"ContainerStarted","Data":"444351202bde57710ee20cbb47a5d840c0672038ddbc4cec6a37680af1a6b247"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.029870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" event={"ID":"de7a0634-e601-4c65-adb6-8e2625e1709b","Type":"ContainerStarted","Data":"2a8276dee550929d4d7ef625c91cb723257e2d5d9f66565c5d887e6480acb3f8"} Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.030064 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" event={"ID":"de7a0634-e601-4c65-adb6-8e2625e1709b","Type":"ContainerStarted","Data":"f9849afe1379e90caf7e4d7eda7d5b93796a8e90010456511138ec3244314909"} Feb 17 
19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.064603 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-68d8fcd9d6-kjplh" podStartSLOduration=2.0645829 podStartE2EDuration="2.0645829s" podCreationTimestamp="2026-02-17 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:15:46.057317954 +0000 UTC m=+5517.432721219" watchObservedRunningTime="2026-02-17 19:15:46.0645829 +0000 UTC m=+5517.439986165" Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.081692 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b97c57775-zc698" podStartSLOduration=2.081676352 podStartE2EDuration="2.081676352s" podCreationTimestamp="2026-02-17 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:15:46.076195554 +0000 UTC m=+5517.451598819" watchObservedRunningTime="2026-02-17 19:15:46.081676352 +0000 UTC m=+5517.457079617" Feb 17 19:15:46 crc kubenswrapper[4892]: I0217 19:15:46.110782 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-694cc5859d-jlpkr" podStartSLOduration=2.110764836 podStartE2EDuration="2.110764836s" podCreationTimestamp="2026-02-17 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:15:46.096971254 +0000 UTC m=+5517.472374529" watchObservedRunningTime="2026-02-17 19:15:46.110764836 +0000 UTC m=+5517.486168101" Feb 17 19:15:47 crc kubenswrapper[4892]: I0217 19:15:47.042303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" 
event={"ID":"5e284394-c7d9-41d3-983e-b1474ec1d8c3","Type":"ContainerStarted","Data":"c994ed33fa988dfffc22871f210d9efd9d1ea6f0a27f5e2f059022d71a55ca41"}
Feb 17 19:15:47 crc kubenswrapper[4892]: I0217 19:15:47.042782 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47"
Feb 17 19:15:47 crc kubenswrapper[4892]: I0217 19:15:47.065617 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" podStartSLOduration=3.065596773 podStartE2EDuration="3.065596773s" podCreationTimestamp="2026-02-17 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:15:47.063774563 +0000 UTC m=+5518.439177828" watchObservedRunningTime="2026-02-17 19:15:47.065596773 +0000 UTC m=+5518.441000038"
Feb 17 19:15:54 crc kubenswrapper[4892]: I0217 19:15:54.705021 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47"
Feb 17 19:15:54 crc kubenswrapper[4892]: I0217 19:15:54.774571 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779db6986f-pv8lv"]
Feb 17 19:15:54 crc kubenswrapper[4892]: I0217 19:15:54.774831 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" containerName="dnsmasq-dns" containerID="cri-o://e836dd11c243b88adbffea6b4e0ce44bb7630d69cb315f5a5476982785963926" gracePeriod=10
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.175519 4892 generic.go:334] "Generic (PLEG): container finished" podID="deca5be5-0971-47ec-858d-8d678eb3b961" containerID="e836dd11c243b88adbffea6b4e0ce44bb7630d69cb315f5a5476982785963926" exitCode=0
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.175768 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" event={"ID":"deca5be5-0971-47ec-858d-8d678eb3b961","Type":"ContainerDied","Data":"e836dd11c243b88adbffea6b4e0ce44bb7630d69cb315f5a5476982785963926"}
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.266224 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779db6986f-pv8lv"
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.369774 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-sb\") pod \"deca5be5-0971-47ec-858d-8d678eb3b961\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") "
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.369997 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-config\") pod \"deca5be5-0971-47ec-858d-8d678eb3b961\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") "
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.370076 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn4gg\" (UniqueName: \"kubernetes.io/projected/deca5be5-0971-47ec-858d-8d678eb3b961-kube-api-access-bn4gg\") pod \"deca5be5-0971-47ec-858d-8d678eb3b961\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") "
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.370127 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-dns-svc\") pod \"deca5be5-0971-47ec-858d-8d678eb3b961\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") "
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.370151 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-nb\") pod \"deca5be5-0971-47ec-858d-8d678eb3b961\" (UID: \"deca5be5-0971-47ec-858d-8d678eb3b961\") "
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.391297 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deca5be5-0971-47ec-858d-8d678eb3b961-kube-api-access-bn4gg" (OuterVolumeSpecName: "kube-api-access-bn4gg") pod "deca5be5-0971-47ec-858d-8d678eb3b961" (UID: "deca5be5-0971-47ec-858d-8d678eb3b961"). InnerVolumeSpecName "kube-api-access-bn4gg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.414685 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "deca5be5-0971-47ec-858d-8d678eb3b961" (UID: "deca5be5-0971-47ec-858d-8d678eb3b961"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.420712 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-config" (OuterVolumeSpecName: "config") pod "deca5be5-0971-47ec-858d-8d678eb3b961" (UID: "deca5be5-0971-47ec-858d-8d678eb3b961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.427104 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "deca5be5-0971-47ec-858d-8d678eb3b961" (UID: "deca5be5-0971-47ec-858d-8d678eb3b961"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.428504 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "deca5be5-0971-47ec-858d-8d678eb3b961" (UID: "deca5be5-0971-47ec-858d-8d678eb3b961"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.476481 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-config\") on node \"crc\" DevicePath \"\""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.476521 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn4gg\" (UniqueName: \"kubernetes.io/projected/deca5be5-0971-47ec-858d-8d678eb3b961-kube-api-access-bn4gg\") on node \"crc\" DevicePath \"\""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.476536 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.476548 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 19:15:55 crc kubenswrapper[4892]: I0217 19:15:55.476561 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deca5be5-0971-47ec-858d-8d678eb3b961-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.213670 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779db6986f-pv8lv" event={"ID":"deca5be5-0971-47ec-858d-8d678eb3b961","Type":"ContainerDied","Data":"8fe9cddc5b7d844468a4ec08a123cae42b1320bbafbe9ecbaa7508fdb8f9475b"}
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.213754 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779db6986f-pv8lv"
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.213757 4892 scope.go:117] "RemoveContainer" containerID="e836dd11c243b88adbffea6b4e0ce44bb7630d69cb315f5a5476982785963926"
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.218439 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-694cc5859d-jlpkr"
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.252143 4892 scope.go:117] "RemoveContainer" containerID="4643169c90fc1fdc9a3900dbab08da34ef1d78185a79fb7de4584a41e5ca055b"
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.299448 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-694cc5859d-jlpkr"
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.377188 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779db6986f-pv8lv"]
Feb 17 19:15:56 crc kubenswrapper[4892]: I0217 19:15:56.393826 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-779db6986f-pv8lv"]
Feb 17 19:15:57 crc kubenswrapper[4892]: I0217 19:15:57.373938 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" path="/var/lib/kubelet/pods/deca5be5-0971-47ec-858d-8d678eb3b961/volumes"
Feb 17 19:16:01 crc kubenswrapper[4892]: I0217 19:16:01.716786 4892 scope.go:117] "RemoveContainer" containerID="7ea9f3bf5ad27feff41b4c27eefa5a700f3e740e1f8e346d9e20edd2214bab6d"
Feb 17 19:16:07 crc kubenswrapper[4892]: I0217 19:16:07.425080 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 19:16:07 crc kubenswrapper[4892]: I0217 19:16:07.425767 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.495378 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-h6827"]
Feb 17 19:16:08 crc kubenswrapper[4892]: E0217 19:16:08.496202 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" containerName="dnsmasq-dns"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.496221 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" containerName="dnsmasq-dns"
Feb 17 19:16:08 crc kubenswrapper[4892]: E0217 19:16:08.496241 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" containerName="init"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.496250 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" containerName="init"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.496521 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="deca5be5-0971-47ec-858d-8d678eb3b961" containerName="dnsmasq-dns"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.497339 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.501941 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-h6827"]
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.558355 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-operator-scripts\") pod \"neutron-db-create-h6827\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") " pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.558574 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjz86\" (UniqueName: \"kubernetes.io/projected/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-kube-api-access-qjz86\") pod \"neutron-db-create-h6827\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") " pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.630380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-df55-account-create-update-wvxkj"]
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.632421 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.635047 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.640277 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-df55-account-create-update-wvxkj"]
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.660314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-operator-scripts\") pod \"neutron-db-create-h6827\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") " pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.660363 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjz86\" (UniqueName: \"kubernetes.io/projected/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-kube-api-access-qjz86\") pod \"neutron-db-create-h6827\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") " pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.662596 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-operator-scripts\") pod \"neutron-db-create-h6827\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") " pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.684583 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjz86\" (UniqueName: \"kubernetes.io/projected/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-kube-api-access-qjz86\") pod \"neutron-db-create-h6827\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") " pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.762587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0410f1-c219-4657-b233-febdc2406bf8-operator-scripts\") pod \"neutron-df55-account-create-update-wvxkj\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") " pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.763079 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9czx2\" (UniqueName: \"kubernetes.io/projected/cc0410f1-c219-4657-b233-febdc2406bf8-kube-api-access-9czx2\") pod \"neutron-df55-account-create-update-wvxkj\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") " pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.815509 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.864656 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0410f1-c219-4657-b233-febdc2406bf8-operator-scripts\") pod \"neutron-df55-account-create-update-wvxkj\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") " pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.865094 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9czx2\" (UniqueName: \"kubernetes.io/projected/cc0410f1-c219-4657-b233-febdc2406bf8-kube-api-access-9czx2\") pod \"neutron-df55-account-create-update-wvxkj\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") " pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.865900 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0410f1-c219-4657-b233-febdc2406bf8-operator-scripts\") pod \"neutron-df55-account-create-update-wvxkj\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") " pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.882756 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9czx2\" (UniqueName: \"kubernetes.io/projected/cc0410f1-c219-4657-b233-febdc2406bf8-kube-api-access-9czx2\") pod \"neutron-df55-account-create-update-wvxkj\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") " pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:08 crc kubenswrapper[4892]: I0217 19:16:08.950014 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:09 crc kubenswrapper[4892]: I0217 19:16:09.246829 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-h6827"]
Feb 17 19:16:09 crc kubenswrapper[4892]: W0217 19:16:09.247056 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4edd1b_14b6_4623_b2cb_5f26e4044fe1.slice/crio-6a1395625d7838fb1c46b41d5d798d77657b2b27bfca2cf98229651c16b0cf18 WatchSource:0}: Error finding container 6a1395625d7838fb1c46b41d5d798d77657b2b27bfca2cf98229651c16b0cf18: Status 404 returned error can't find the container with id 6a1395625d7838fb1c46b41d5d798d77657b2b27bfca2cf98229651c16b0cf18
Feb 17 19:16:09 crc kubenswrapper[4892]: I0217 19:16:09.354442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h6827" event={"ID":"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1","Type":"ContainerStarted","Data":"6a1395625d7838fb1c46b41d5d798d77657b2b27bfca2cf98229651c16b0cf18"}
Feb 17 19:16:09 crc kubenswrapper[4892]: W0217 19:16:09.411391 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc0410f1_c219_4657_b233_febdc2406bf8.slice/crio-7dc914bb8eb1c53302a3d02069f657ab51340c707dddbaf5f5cbdc5a48d5b621 WatchSource:0}: Error finding container 7dc914bb8eb1c53302a3d02069f657ab51340c707dddbaf5f5cbdc5a48d5b621: Status 404 returned error can't find the container with id 7dc914bb8eb1c53302a3d02069f657ab51340c707dddbaf5f5cbdc5a48d5b621
Feb 17 19:16:09 crc kubenswrapper[4892]: I0217 19:16:09.419086 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-df55-account-create-update-wvxkj"]
Feb 17 19:16:10 crc kubenswrapper[4892]: I0217 19:16:10.381193 4892 generic.go:334] "Generic (PLEG): container finished" podID="cc0410f1-c219-4657-b233-febdc2406bf8" containerID="512bd8c4a3b30d9dad3257749b2223715980d679bf3a038de491ed77dbd33e78" exitCode=0
Feb 17 19:16:10 crc kubenswrapper[4892]: I0217 19:16:10.381566 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df55-account-create-update-wvxkj" event={"ID":"cc0410f1-c219-4657-b233-febdc2406bf8","Type":"ContainerDied","Data":"512bd8c4a3b30d9dad3257749b2223715980d679bf3a038de491ed77dbd33e78"}
Feb 17 19:16:10 crc kubenswrapper[4892]: I0217 19:16:10.381599 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df55-account-create-update-wvxkj" event={"ID":"cc0410f1-c219-4657-b233-febdc2406bf8","Type":"ContainerStarted","Data":"7dc914bb8eb1c53302a3d02069f657ab51340c707dddbaf5f5cbdc5a48d5b621"}
Feb 17 19:16:10 crc kubenswrapper[4892]: I0217 19:16:10.387720 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" containerID="eb207e4acf5536ef95c537fb8f7cc4631c7f08e8223e6ba0d1a0fbaad9708aec" exitCode=0
Feb 17 19:16:10 crc kubenswrapper[4892]: I0217 19:16:10.387779 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h6827" event={"ID":"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1","Type":"ContainerDied","Data":"eb207e4acf5536ef95c537fb8f7cc4631c7f08e8223e6ba0d1a0fbaad9708aec"}
Feb 17 19:16:11 crc kubenswrapper[4892]: I0217 19:16:11.915959 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:11 crc kubenswrapper[4892]: I0217 19:16:11.921668 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.040182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9czx2\" (UniqueName: \"kubernetes.io/projected/cc0410f1-c219-4657-b233-febdc2406bf8-kube-api-access-9czx2\") pod \"cc0410f1-c219-4657-b233-febdc2406bf8\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") "
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.040313 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjz86\" (UniqueName: \"kubernetes.io/projected/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-kube-api-access-qjz86\") pod \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") "
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.040382 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-operator-scripts\") pod \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\" (UID: \"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1\") "
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.040421 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0410f1-c219-4657-b233-febdc2406bf8-operator-scripts\") pod \"cc0410f1-c219-4657-b233-febdc2406bf8\" (UID: \"cc0410f1-c219-4657-b233-febdc2406bf8\") "
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.041261 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" (UID: "9b4edd1b-14b6-4623-b2cb-5f26e4044fe1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.041422 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0410f1-c219-4657-b233-febdc2406bf8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc0410f1-c219-4657-b233-febdc2406bf8" (UID: "cc0410f1-c219-4657-b233-febdc2406bf8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.046623 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-kube-api-access-qjz86" (OuterVolumeSpecName: "kube-api-access-qjz86") pod "9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" (UID: "9b4edd1b-14b6-4623-b2cb-5f26e4044fe1"). InnerVolumeSpecName "kube-api-access-qjz86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.047162 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0410f1-c219-4657-b233-febdc2406bf8-kube-api-access-9czx2" (OuterVolumeSpecName: "kube-api-access-9czx2") pod "cc0410f1-c219-4657-b233-febdc2406bf8" (UID: "cc0410f1-c219-4657-b233-febdc2406bf8"). InnerVolumeSpecName "kube-api-access-9czx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.143189 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjz86\" (UniqueName: \"kubernetes.io/projected/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-kube-api-access-qjz86\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.143238 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.143257 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0410f1-c219-4657-b233-febdc2406bf8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.143278 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9czx2\" (UniqueName: \"kubernetes.io/projected/cc0410f1-c219-4657-b233-febdc2406bf8-kube-api-access-9czx2\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.417369 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h6827"
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.417696 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h6827" event={"ID":"9b4edd1b-14b6-4623-b2cb-5f26e4044fe1","Type":"ContainerDied","Data":"6a1395625d7838fb1c46b41d5d798d77657b2b27bfca2cf98229651c16b0cf18"}
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.418919 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1395625d7838fb1c46b41d5d798d77657b2b27bfca2cf98229651c16b0cf18"
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.420289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df55-account-create-update-wvxkj" event={"ID":"cc0410f1-c219-4657-b233-febdc2406bf8","Type":"ContainerDied","Data":"7dc914bb8eb1c53302a3d02069f657ab51340c707dddbaf5f5cbdc5a48d5b621"}
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.420346 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dc914bb8eb1c53302a3d02069f657ab51340c707dddbaf5f5cbdc5a48d5b621"
Feb 17 19:16:12 crc kubenswrapper[4892]: I0217 19:16:12.420368 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df55-account-create-update-wvxkj"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.837369 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kjvr5"]
Feb 17 19:16:13 crc kubenswrapper[4892]: E0217 19:16:13.838220 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" containerName="mariadb-database-create"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.838249 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" containerName="mariadb-database-create"
Feb 17 19:16:13 crc kubenswrapper[4892]: E0217 19:16:13.838265 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0410f1-c219-4657-b233-febdc2406bf8" containerName="mariadb-account-create-update"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.838270 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0410f1-c219-4657-b233-febdc2406bf8" containerName="mariadb-account-create-update"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.838495 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" containerName="mariadb-database-create"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.838509 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0410f1-c219-4657-b233-febdc2406bf8" containerName="mariadb-account-create-update"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.839295 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.847280 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rjlsn"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.848011 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.849151 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.862026 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kjvr5"]
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.887911 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-config\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.888150 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-combined-ca-bundle\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.888222 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962v6\" (UniqueName: \"kubernetes.io/projected/587b7485-a715-4355-8424-08cdb036121d-kube-api-access-962v6\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.990331 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962v6\" (UniqueName: \"kubernetes.io/projected/587b7485-a715-4355-8424-08cdb036121d-kube-api-access-962v6\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.990488 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-config\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.990688 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-combined-ca-bundle\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.997197 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-config\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:13 crc kubenswrapper[4892]: I0217 19:16:13.998459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-combined-ca-bundle\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:14 crc kubenswrapper[4892]: I0217 19:16:14.008362 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962v6\" (UniqueName: \"kubernetes.io/projected/587b7485-a715-4355-8424-08cdb036121d-kube-api-access-962v6\") pod \"neutron-db-sync-kjvr5\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") " pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:14 crc kubenswrapper[4892]: I0217 19:16:14.207220 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:14 crc kubenswrapper[4892]: I0217 19:16:14.705744 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kjvr5"]
Feb 17 19:16:15 crc kubenswrapper[4892]: I0217 19:16:15.469362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kjvr5" event={"ID":"587b7485-a715-4355-8424-08cdb036121d","Type":"ContainerStarted","Data":"d8fa6b77b197156ab5b88cc3d6d45054d4896bf5a83b3b2c48fd3b4478b170ad"}
Feb 17 19:16:15 crc kubenswrapper[4892]: I0217 19:16:15.470121 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kjvr5" event={"ID":"587b7485-a715-4355-8424-08cdb036121d","Type":"ContainerStarted","Data":"46e4eba98ca6c7fdc59569c86dee206b1dd21aea1d72ab1cbc8eed7b578575fd"}
Feb 17 19:16:15 crc kubenswrapper[4892]: I0217 19:16:15.498467 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kjvr5" podStartSLOduration=2.498447747 podStartE2EDuration="2.498447747s" podCreationTimestamp="2026-02-17 19:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:16:15.491246914 +0000 UTC m=+5546.866650229" watchObservedRunningTime="2026-02-17 19:16:15.498447747 +0000 UTC m=+5546.873851012"
Feb 17 19:16:20 crc kubenswrapper[4892]: I0217 19:16:20.581745 4892 generic.go:334] "Generic (PLEG): container finished" podID="587b7485-a715-4355-8424-08cdb036121d" containerID="d8fa6b77b197156ab5b88cc3d6d45054d4896bf5a83b3b2c48fd3b4478b170ad" exitCode=0
Feb 17 19:16:20 crc kubenswrapper[4892]: I0217 19:16:20.581850 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kjvr5" event={"ID":"587b7485-a715-4355-8424-08cdb036121d","Type":"ContainerDied","Data":"d8fa6b77b197156ab5b88cc3d6d45054d4896bf5a83b3b2c48fd3b4478b170ad"}
Feb 17 19:16:21 crc kubenswrapper[4892]: I0217 19:16:21.991317 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.049878 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-combined-ca-bundle\") pod \"587b7485-a715-4355-8424-08cdb036121d\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") "
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.049959 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-962v6\" (UniqueName: \"kubernetes.io/projected/587b7485-a715-4355-8424-08cdb036121d-kube-api-access-962v6\") pod \"587b7485-a715-4355-8424-08cdb036121d\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") "
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.050045 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-config\") pod \"587b7485-a715-4355-8424-08cdb036121d\" (UID: \"587b7485-a715-4355-8424-08cdb036121d\") "
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.056288 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587b7485-a715-4355-8424-08cdb036121d-kube-api-access-962v6" (OuterVolumeSpecName: "kube-api-access-962v6") pod "587b7485-a715-4355-8424-08cdb036121d" (UID: "587b7485-a715-4355-8424-08cdb036121d"). InnerVolumeSpecName "kube-api-access-962v6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.082763 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "587b7485-a715-4355-8424-08cdb036121d" (UID: "587b7485-a715-4355-8424-08cdb036121d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.087594 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-config" (OuterVolumeSpecName: "config") pod "587b7485-a715-4355-8424-08cdb036121d" (UID: "587b7485-a715-4355-8424-08cdb036121d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.151433 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.151467 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-962v6\" (UniqueName: \"kubernetes.io/projected/587b7485-a715-4355-8424-08cdb036121d-kube-api-access-962v6\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.151480 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/587b7485-a715-4355-8424-08cdb036121d-config\") on node \"crc\" DevicePath \"\""
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.615621 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kjvr5" event={"ID":"587b7485-a715-4355-8424-08cdb036121d","Type":"ContainerDied","Data":"46e4eba98ca6c7fdc59569c86dee206b1dd21aea1d72ab1cbc8eed7b578575fd"}
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.616026 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e4eba98ca6c7fdc59569c86dee206b1dd21aea1d72ab1cbc8eed7b578575fd"
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.615683 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kjvr5"
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.868452 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5675647d9-6z5jv"]
Feb 17 19:16:22 crc kubenswrapper[4892]: E0217 19:16:22.868904 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587b7485-a715-4355-8424-08cdb036121d" containerName="neutron-db-sync"
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.868916 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="587b7485-a715-4355-8424-08cdb036121d" containerName="neutron-db-sync"
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.869132 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="587b7485-a715-4355-8424-08cdb036121d" containerName="neutron-db-sync"
Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.870120 4892 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.919686 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5675647d9-6z5jv"] Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.972971 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-config\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.973012 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-dns-svc\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.973084 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6fjs\" (UniqueName: \"kubernetes.io/projected/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-kube-api-access-j6fjs\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.973139 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-nb\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:22 crc kubenswrapper[4892]: I0217 19:16:22.973157 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-sb\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.079281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-config\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.079322 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-dns-svc\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.079406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6fjs\" (UniqueName: \"kubernetes.io/projected/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-kube-api-access-j6fjs\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.079468 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-nb\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.079493 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-sb\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.080429 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-sb\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.081035 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-config\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.081565 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-nb\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.085423 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-dns-svc\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.092566 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cd6dd448c-cv5w9"] Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.099359 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.105723 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rjlsn" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.106060 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.107209 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.113448 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6fjs\" (UniqueName: \"kubernetes.io/projected/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-kube-api-access-j6fjs\") pod \"dnsmasq-dns-5675647d9-6z5jv\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.129387 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd6dd448c-cv5w9"] Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.181242 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-config\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.181565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-httpd-config\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.181587 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-combined-ca-bundle\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.181752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpkx\" (UniqueName: \"kubernetes.io/projected/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-kube-api-access-whpkx\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.229561 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.283766 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpkx\" (UniqueName: \"kubernetes.io/projected/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-kube-api-access-whpkx\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.283838 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-config\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.283860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-httpd-config\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " 
pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.283881 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-combined-ca-bundle\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.289134 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-httpd-config\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.294576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-config\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.297555 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-combined-ca-bundle\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.305413 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpkx\" (UniqueName: \"kubernetes.io/projected/63dc1c5d-2307-487a-a2f3-5c40864bdfb9-kube-api-access-whpkx\") pod \"neutron-cd6dd448c-cv5w9\" (UID: \"63dc1c5d-2307-487a-a2f3-5c40864bdfb9\") " pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.479333 4892 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.610764 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5675647d9-6z5jv"] Feb 17 19:16:23 crc kubenswrapper[4892]: I0217 19:16:23.642725 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" event={"ID":"f21a68f6-4b3e-40a5-aa01-aee310e5aabb","Type":"ContainerStarted","Data":"7b4dd1e9e2e50b702b93c7da5b33423f7d82df960a0886b343f09015c929263c"} Feb 17 19:16:24 crc kubenswrapper[4892]: W0217 19:16:24.071699 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63dc1c5d_2307_487a_a2f3_5c40864bdfb9.slice/crio-2afceb9648926221a7192c597b18de4107a07e87f00a0e6571403b15cef866a1 WatchSource:0}: Error finding container 2afceb9648926221a7192c597b18de4107a07e87f00a0e6571403b15cef866a1: Status 404 returned error can't find the container with id 2afceb9648926221a7192c597b18de4107a07e87f00a0e6571403b15cef866a1 Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.071739 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd6dd448c-cv5w9"] Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.652255 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd6dd448c-cv5w9" event={"ID":"63dc1c5d-2307-487a-a2f3-5c40864bdfb9","Type":"ContainerStarted","Data":"b31a023661f3114334c55b388ad5dd94a48a3475fcf8cebe81c0560010b13779"} Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.652572 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.652584 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd6dd448c-cv5w9" 
event={"ID":"63dc1c5d-2307-487a-a2f3-5c40864bdfb9","Type":"ContainerStarted","Data":"668a7d90a0085ad55dc50cdbfddad4084ab246f2ab4829da347fd7771e26d4e7"} Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.652593 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd6dd448c-cv5w9" event={"ID":"63dc1c5d-2307-487a-a2f3-5c40864bdfb9","Type":"ContainerStarted","Data":"2afceb9648926221a7192c597b18de4107a07e87f00a0e6571403b15cef866a1"} Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.654227 4892 generic.go:334] "Generic (PLEG): container finished" podID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerID="26c491876dfa29d22fffb447dc10eab51c37a75f652de5af29f0e1bf12829650" exitCode=0 Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.654267 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" event={"ID":"f21a68f6-4b3e-40a5-aa01-aee310e5aabb","Type":"ContainerDied","Data":"26c491876dfa29d22fffb447dc10eab51c37a75f652de5af29f0e1bf12829650"} Feb 17 19:16:24 crc kubenswrapper[4892]: I0217 19:16:24.679131 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cd6dd448c-cv5w9" podStartSLOduration=1.679104565 podStartE2EDuration="1.679104565s" podCreationTimestamp="2026-02-17 19:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:16:24.675283571 +0000 UTC m=+5556.050686846" watchObservedRunningTime="2026-02-17 19:16:24.679104565 +0000 UTC m=+5556.054507840" Feb 17 19:16:25 crc kubenswrapper[4892]: I0217 19:16:25.667544 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" event={"ID":"f21a68f6-4b3e-40a5-aa01-aee310e5aabb","Type":"ContainerStarted","Data":"b72e2b4573aa564e44d61945813dc88dcaec399a7156e9b402ce4ef3c68eaa93"} Feb 17 19:16:25 crc kubenswrapper[4892]: I0217 19:16:25.692317 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" podStartSLOduration=3.692300895 podStartE2EDuration="3.692300895s" podCreationTimestamp="2026-02-17 19:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:16:25.686042697 +0000 UTC m=+5557.061445982" watchObservedRunningTime="2026-02-17 19:16:25.692300895 +0000 UTC m=+5557.067704160" Feb 17 19:16:26 crc kubenswrapper[4892]: I0217 19:16:26.711501 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.232888 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.323638 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd66dcf7-j2q47"] Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.323869 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerName="dnsmasq-dns" containerID="cri-o://c994ed33fa988dfffc22871f210d9efd9d1ea6f0a27f5e2f059022d71a55ca41" gracePeriod=10 Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.794078 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" event={"ID":"5e284394-c7d9-41d3-983e-b1474ec1d8c3","Type":"ContainerDied","Data":"c994ed33fa988dfffc22871f210d9efd9d1ea6f0a27f5e2f059022d71a55ca41"} Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.793994 4892 generic.go:334] "Generic (PLEG): container finished" podID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerID="c994ed33fa988dfffc22871f210d9efd9d1ea6f0a27f5e2f059022d71a55ca41" exitCode=0 Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 
19:16:33.892032 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.951032 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-dns-svc\") pod \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.951145 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-config\") pod \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.951274 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-nb\") pod \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.951302 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/5e284394-c7d9-41d3-983e-b1474ec1d8c3-kube-api-access-j75ph\") pod \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.951324 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-sb\") pod \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\" (UID: \"5e284394-c7d9-41d3-983e-b1474ec1d8c3\") " Feb 17 19:16:33 crc kubenswrapper[4892]: I0217 19:16:33.979138 4892 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e284394-c7d9-41d3-983e-b1474ec1d8c3-kube-api-access-j75ph" (OuterVolumeSpecName: "kube-api-access-j75ph") pod "5e284394-c7d9-41d3-983e-b1474ec1d8c3" (UID: "5e284394-c7d9-41d3-983e-b1474ec1d8c3"). InnerVolumeSpecName "kube-api-access-j75ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.000572 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e284394-c7d9-41d3-983e-b1474ec1d8c3" (UID: "5e284394-c7d9-41d3-983e-b1474ec1d8c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.001676 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-config" (OuterVolumeSpecName: "config") pod "5e284394-c7d9-41d3-983e-b1474ec1d8c3" (UID: "5e284394-c7d9-41d3-983e-b1474ec1d8c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.005518 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e284394-c7d9-41d3-983e-b1474ec1d8c3" (UID: "5e284394-c7d9-41d3-983e-b1474ec1d8c3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.019089 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e284394-c7d9-41d3-983e-b1474ec1d8c3" (UID: "5e284394-c7d9-41d3-983e-b1474ec1d8c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.053780 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.053829 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/5e284394-c7d9-41d3-983e-b1474ec1d8c3-kube-api-access-j75ph\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.053841 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.053852 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.053860 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e284394-c7d9-41d3-983e-b1474ec1d8c3-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.804409 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" 
event={"ID":"5e284394-c7d9-41d3-983e-b1474ec1d8c3","Type":"ContainerDied","Data":"8680a7b5023f11fb6d428d73e2021eaffbe60fe08503c23077610f88f1b9c75e"} Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.804471 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd66dcf7-j2q47" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.804714 4892 scope.go:117] "RemoveContainer" containerID="c994ed33fa988dfffc22871f210d9efd9d1ea6f0a27f5e2f059022d71a55ca41" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.822743 4892 scope.go:117] "RemoveContainer" containerID="a8c8db4bc61ca522bd50695a94b01ccf18d9da6b3d5fb82d47e2508b4651986a" Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.844538 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd66dcf7-j2q47"] Feb 17 19:16:34 crc kubenswrapper[4892]: I0217 19:16:34.856309 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd66dcf7-j2q47"] Feb 17 19:16:35 crc kubenswrapper[4892]: I0217 19:16:35.374661 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" path="/var/lib/kubelet/pods/5e284394-c7d9-41d3-983e-b1474ec1d8c3/volumes" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.696625 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nh84n"] Feb 17 19:16:36 crc kubenswrapper[4892]: E0217 19:16:36.701023 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerName="dnsmasq-dns" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.701051 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerName="dnsmasq-dns" Feb 17 19:16:36 crc kubenswrapper[4892]: E0217 19:16:36.701070 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerName="init" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.701078 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerName="init" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.701459 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e284394-c7d9-41d3-983e-b1474ec1d8c3" containerName="dnsmasq-dns" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.703253 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.713061 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nh84n"] Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.808115 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-utilities\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.808224 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-catalog-content\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.808276 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzlm\" (UniqueName: \"kubernetes.io/projected/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-kube-api-access-pnzlm\") pod \"community-operators-nh84n\" (UID: 
\"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.909885 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-utilities\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.909999 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-catalog-content\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.910058 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzlm\" (UniqueName: \"kubernetes.io/projected/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-kube-api-access-pnzlm\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.910757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-utilities\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.910798 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-catalog-content\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") 
" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:36 crc kubenswrapper[4892]: I0217 19:16:36.933804 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzlm\" (UniqueName: \"kubernetes.io/projected/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-kube-api-access-pnzlm\") pod \"community-operators-nh84n\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.047253 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.425348 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.425632 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.425686 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.426429 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bc133c88c6796604fb1d95fbac5023f863829395efa1fa885020ed0c34254e3"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.426493 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://6bc133c88c6796604fb1d95fbac5023f863829395efa1fa885020ed0c34254e3" gracePeriod=600 Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.617605 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nh84n"] Feb 17 19:16:37 crc kubenswrapper[4892]: W0217 19:16:37.626155 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dbcf46d_0f16_415b_96cf_a727d8b2bdb0.slice/crio-74629f278f16b350bf36987c3c58ed715fb911a1c26aeff8bdfac3816663e5a4 WatchSource:0}: Error finding container 74629f278f16b350bf36987c3c58ed715fb911a1c26aeff8bdfac3816663e5a4: Status 404 returned error can't find the container with id 74629f278f16b350bf36987c3c58ed715fb911a1c26aeff8bdfac3816663e5a4 Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.838299 4892 generic.go:334] "Generic (PLEG): container finished" podID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerID="318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d" exitCode=0 Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.838333 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh84n" event={"ID":"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0","Type":"ContainerDied","Data":"318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d"} Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.838642 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh84n" 
event={"ID":"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0","Type":"ContainerStarted","Data":"74629f278f16b350bf36987c3c58ed715fb911a1c26aeff8bdfac3816663e5a4"} Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.843584 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="6bc133c88c6796604fb1d95fbac5023f863829395efa1fa885020ed0c34254e3" exitCode=0 Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.843612 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"6bc133c88c6796604fb1d95fbac5023f863829395efa1fa885020ed0c34254e3"} Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.843632 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"} Feb 17 19:16:37 crc kubenswrapper[4892]: I0217 19:16:37.843646 4892 scope.go:117] "RemoveContainer" containerID="1176c4dd94e4eb26cd8f99c2e68bcd6da89a092583d39fe54d90751c364f5f39" Feb 17 19:16:39 crc kubenswrapper[4892]: I0217 19:16:39.867706 4892 generic.go:334] "Generic (PLEG): container finished" podID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerID="69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f" exitCode=0 Feb 17 19:16:39 crc kubenswrapper[4892]: I0217 19:16:39.867822 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh84n" event={"ID":"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0","Type":"ContainerDied","Data":"69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f"} Feb 17 19:16:40 crc kubenswrapper[4892]: I0217 19:16:40.885935 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nh84n" event={"ID":"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0","Type":"ContainerStarted","Data":"bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158"} Feb 17 19:16:40 crc kubenswrapper[4892]: I0217 19:16:40.918669 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nh84n" podStartSLOduration=2.493922868 podStartE2EDuration="4.918649585s" podCreationTimestamp="2026-02-17 19:16:36 +0000 UTC" firstStartedPulling="2026-02-17 19:16:37.841683044 +0000 UTC m=+5569.217086319" lastFinishedPulling="2026-02-17 19:16:40.266409761 +0000 UTC m=+5571.641813036" observedRunningTime="2026-02-17 19:16:40.906178738 +0000 UTC m=+5572.281582033" watchObservedRunningTime="2026-02-17 19:16:40.918649585 +0000 UTC m=+5572.294052850" Feb 17 19:16:47 crc kubenswrapper[4892]: I0217 19:16:47.047968 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:47 crc kubenswrapper[4892]: I0217 19:16:47.048577 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:47 crc kubenswrapper[4892]: I0217 19:16:47.106045 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:48 crc kubenswrapper[4892]: I0217 19:16:48.057549 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:48 crc kubenswrapper[4892]: I0217 19:16:48.121178 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nh84n"] Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.030116 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nh84n" 
podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="registry-server" containerID="cri-o://bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158" gracePeriod=2 Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.525890 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.600171 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-catalog-content\") pod \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.600460 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzlm\" (UniqueName: \"kubernetes.io/projected/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-kube-api-access-pnzlm\") pod \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.600692 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-utilities\") pod \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\" (UID: \"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0\") " Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.602187 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-utilities" (OuterVolumeSpecName: "utilities") pod "3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" (UID: "3dbcf46d-0f16-415b-96cf-a727d8b2bdb0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.613074 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-kube-api-access-pnzlm" (OuterVolumeSpecName: "kube-api-access-pnzlm") pod "3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" (UID: "3dbcf46d-0f16-415b-96cf-a727d8b2bdb0"). InnerVolumeSpecName "kube-api-access-pnzlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.668540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" (UID: "3dbcf46d-0f16-415b-96cf-a727d8b2bdb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.702997 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzlm\" (UniqueName: \"kubernetes.io/projected/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-kube-api-access-pnzlm\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.703045 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:50 crc kubenswrapper[4892]: I0217 19:16:50.703054 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.040590 4892 generic.go:334] "Generic (PLEG): container finished" podID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" 
containerID="bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158" exitCode=0 Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.040607 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh84n" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.040697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh84n" event={"ID":"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0","Type":"ContainerDied","Data":"bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158"} Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.040776 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh84n" event={"ID":"3dbcf46d-0f16-415b-96cf-a727d8b2bdb0","Type":"ContainerDied","Data":"74629f278f16b350bf36987c3c58ed715fb911a1c26aeff8bdfac3816663e5a4"} Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.040854 4892 scope.go:117] "RemoveContainer" containerID="bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.095841 4892 scope.go:117] "RemoveContainer" containerID="69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.103325 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nh84n"] Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.114518 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nh84n"] Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.121439 4892 scope.go:117] "RemoveContainer" containerID="318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.182577 4892 scope.go:117] "RemoveContainer" containerID="bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158" Feb 17 
19:16:51 crc kubenswrapper[4892]: E0217 19:16:51.183146 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158\": container with ID starting with bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158 not found: ID does not exist" containerID="bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.183215 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158"} err="failed to get container status \"bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158\": rpc error: code = NotFound desc = could not find container \"bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158\": container with ID starting with bed2a9ae6d9522517e810689d42521fe60f3c38932367be125b383d181245158 not found: ID does not exist" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.183258 4892 scope.go:117] "RemoveContainer" containerID="69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f" Feb 17 19:16:51 crc kubenswrapper[4892]: E0217 19:16:51.183747 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f\": container with ID starting with 69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f not found: ID does not exist" containerID="69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.183778 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f"} err="failed to get container status 
\"69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f\": rpc error: code = NotFound desc = could not find container \"69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f\": container with ID starting with 69c2fcae7c16eb1e5bab244254e7394cf69e62c961a3886bdc60353dcc0d6e9f not found: ID does not exist" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.183799 4892 scope.go:117] "RemoveContainer" containerID="318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d" Feb 17 19:16:51 crc kubenswrapper[4892]: E0217 19:16:51.184292 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d\": container with ID starting with 318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d not found: ID does not exist" containerID="318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.184310 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d"} err="failed to get container status \"318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d\": rpc error: code = NotFound desc = could not find container \"318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d\": container with ID starting with 318fe4fc70429cdfa66ce1353c69ae6386836d28aa90ab15defa830d3d29709d not found: ID does not exist" Feb 17 19:16:51 crc kubenswrapper[4892]: I0217 19:16:51.377344 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" path="/var/lib/kubelet/pods/3dbcf46d-0f16-415b-96cf-a727d8b2bdb0/volumes" Feb 17 19:16:53 crc kubenswrapper[4892]: I0217 19:16:53.494405 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cd6dd448c-cv5w9" Feb 17 
19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.739890 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wttf5"] Feb 17 19:16:54 crc kubenswrapper[4892]: E0217 19:16:54.740664 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="registry-server" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.740682 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="registry-server" Feb 17 19:16:54 crc kubenswrapper[4892]: E0217 19:16:54.740708 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="extract-content" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.740716 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="extract-content" Feb 17 19:16:54 crc kubenswrapper[4892]: E0217 19:16:54.740741 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="extract-utilities" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.740750 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="extract-utilities" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.741061 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbcf46d-0f16-415b-96cf-a727d8b2bdb0" containerName="registry-server" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.742452 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.754291 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wttf5"] Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.881491 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5hs\" (UniqueName: \"kubernetes.io/projected/2f283cf0-c55a-469e-b46e-d5eb187f33e9-kube-api-access-mr5hs\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.881582 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-utilities\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.881630 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-catalog-content\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.982722 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-catalog-content\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.983039 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mr5hs\" (UniqueName: \"kubernetes.io/projected/2f283cf0-c55a-469e-b46e-d5eb187f33e9-kube-api-access-mr5hs\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.983114 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-utilities\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.983722 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-utilities\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:54 crc kubenswrapper[4892]: I0217 19:16:54.984241 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-catalog-content\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:55 crc kubenswrapper[4892]: I0217 19:16:55.016238 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5hs\" (UniqueName: \"kubernetes.io/projected/2f283cf0-c55a-469e-b46e-d5eb187f33e9-kube-api-access-mr5hs\") pod \"redhat-operators-wttf5\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:55 crc kubenswrapper[4892]: I0217 19:16:55.062328 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:16:55 crc kubenswrapper[4892]: I0217 19:16:55.584640 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wttf5"] Feb 17 19:16:56 crc kubenswrapper[4892]: I0217 19:16:56.099406 4892 generic.go:334] "Generic (PLEG): container finished" podID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerID="1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b" exitCode=0 Feb 17 19:16:56 crc kubenswrapper[4892]: I0217 19:16:56.099507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wttf5" event={"ID":"2f283cf0-c55a-469e-b46e-d5eb187f33e9","Type":"ContainerDied","Data":"1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b"} Feb 17 19:16:56 crc kubenswrapper[4892]: I0217 19:16:56.099790 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wttf5" event={"ID":"2f283cf0-c55a-469e-b46e-d5eb187f33e9","Type":"ContainerStarted","Data":"544386a9887d3b3701f04b4f927785ef338c09b5965a92e90931e36620118216"} Feb 17 19:16:58 crc kubenswrapper[4892]: I0217 19:16:58.120697 4892 generic.go:334] "Generic (PLEG): container finished" podID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerID="184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6" exitCode=0 Feb 17 19:16:58 crc kubenswrapper[4892]: I0217 19:16:58.120787 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wttf5" event={"ID":"2f283cf0-c55a-469e-b46e-d5eb187f33e9","Type":"ContainerDied","Data":"184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6"} Feb 17 19:16:59 crc kubenswrapper[4892]: I0217 19:16:59.136293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wttf5" 
event={"ID":"2f283cf0-c55a-469e-b46e-d5eb187f33e9","Type":"ContainerStarted","Data":"c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766"} Feb 17 19:16:59 crc kubenswrapper[4892]: I0217 19:16:59.163125 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wttf5" podStartSLOduration=2.727135848 podStartE2EDuration="5.163103448s" podCreationTimestamp="2026-02-17 19:16:54 +0000 UTC" firstStartedPulling="2026-02-17 19:16:56.101125692 +0000 UTC m=+5587.476528957" lastFinishedPulling="2026-02-17 19:16:58.537093252 +0000 UTC m=+5589.912496557" observedRunningTime="2026-02-17 19:16:59.155853063 +0000 UTC m=+5590.531256338" watchObservedRunningTime="2026-02-17 19:16:59.163103448 +0000 UTC m=+5590.538506723" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.109854 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cpcdn"] Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.111099 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.114836 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cpcdn"] Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.183402 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f3d8-account-create-update-wnrsx"] Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.186504 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.188441 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.199305 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f3d8-account-create-update-wnrsx"] Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.301712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776ec54b-43e8-4316-bbca-e54f0302c366-operator-scripts\") pod \"glance-db-create-cpcdn\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.301770 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd4vd\" (UniqueName: \"kubernetes.io/projected/1719978b-bac0-4de9-a1d0-0699cc81c53b-kube-api-access-kd4vd\") pod \"glance-f3d8-account-create-update-wnrsx\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.301802 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1719978b-bac0-4de9-a1d0-0699cc81c53b-operator-scripts\") pod \"glance-f3d8-account-create-update-wnrsx\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.302067 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpfmw\" (UniqueName: \"kubernetes.io/projected/776ec54b-43e8-4316-bbca-e54f0302c366-kube-api-access-wpfmw\") pod 
\"glance-db-create-cpcdn\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.403663 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd4vd\" (UniqueName: \"kubernetes.io/projected/1719978b-bac0-4de9-a1d0-0699cc81c53b-kube-api-access-kd4vd\") pod \"glance-f3d8-account-create-update-wnrsx\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.403722 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1719978b-bac0-4de9-a1d0-0699cc81c53b-operator-scripts\") pod \"glance-f3d8-account-create-update-wnrsx\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.403794 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpfmw\" (UniqueName: \"kubernetes.io/projected/776ec54b-43e8-4316-bbca-e54f0302c366-kube-api-access-wpfmw\") pod \"glance-db-create-cpcdn\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.403937 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776ec54b-43e8-4316-bbca-e54f0302c366-operator-scripts\") pod \"glance-db-create-cpcdn\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.404707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1719978b-bac0-4de9-a1d0-0699cc81c53b-operator-scripts\") pod 
\"glance-f3d8-account-create-update-wnrsx\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.404908 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776ec54b-43e8-4316-bbca-e54f0302c366-operator-scripts\") pod \"glance-db-create-cpcdn\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.427860 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd4vd\" (UniqueName: \"kubernetes.io/projected/1719978b-bac0-4de9-a1d0-0699cc81c53b-kube-api-access-kd4vd\") pod \"glance-f3d8-account-create-update-wnrsx\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.432228 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpfmw\" (UniqueName: \"kubernetes.io/projected/776ec54b-43e8-4316-bbca-e54f0302c366-kube-api-access-wpfmw\") pod \"glance-db-create-cpcdn\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.447066 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.503518 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:00 crc kubenswrapper[4892]: I0217 19:17:00.972347 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cpcdn"] Feb 17 19:17:00 crc kubenswrapper[4892]: W0217 19:17:00.975523 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod776ec54b_43e8_4316_bbca_e54f0302c366.slice/crio-17454c7be5af98b3a1b72922626dc3d5e1f2dc1cde45eea063a6cce0fa2f4e3f WatchSource:0}: Error finding container 17454c7be5af98b3a1b72922626dc3d5e1f2dc1cde45eea063a6cce0fa2f4e3f: Status 404 returned error can't find the container with id 17454c7be5af98b3a1b72922626dc3d5e1f2dc1cde45eea063a6cce0fa2f4e3f Feb 17 19:17:01 crc kubenswrapper[4892]: I0217 19:17:01.069302 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f3d8-account-create-update-wnrsx"] Feb 17 19:17:01 crc kubenswrapper[4892]: I0217 19:17:01.156468 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f3d8-account-create-update-wnrsx" event={"ID":"1719978b-bac0-4de9-a1d0-0699cc81c53b","Type":"ContainerStarted","Data":"7ebbb5cff129dbbf1de1f0b605308d9ccb26bfc91525d6c8a1ce5ff20edc1494"} Feb 17 19:17:01 crc kubenswrapper[4892]: I0217 19:17:01.158020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cpcdn" event={"ID":"776ec54b-43e8-4316-bbca-e54f0302c366","Type":"ContainerStarted","Data":"2124fc3cd68ef6e01d0bda780f105f3113271b407b7ed59edf969e538b690838"} Feb 17 19:17:01 crc kubenswrapper[4892]: I0217 19:17:01.158050 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cpcdn" event={"ID":"776ec54b-43e8-4316-bbca-e54f0302c366","Type":"ContainerStarted","Data":"17454c7be5af98b3a1b72922626dc3d5e1f2dc1cde45eea063a6cce0fa2f4e3f"} Feb 17 19:17:01 crc kubenswrapper[4892]: I0217 19:17:01.176437 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-cpcdn" podStartSLOduration=1.176391837 podStartE2EDuration="1.176391837s" podCreationTimestamp="2026-02-17 19:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:01.174353971 +0000 UTC m=+5592.549757236" watchObservedRunningTime="2026-02-17 19:17:01.176391837 +0000 UTC m=+5592.551795122" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.171011 4892 generic.go:334] "Generic (PLEG): container finished" podID="1719978b-bac0-4de9-a1d0-0699cc81c53b" containerID="ccb9f9aaf311645e58e52d2cdb654cf0177eb185804f2a0b51390b71db11738a" exitCode=0 Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.171066 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f3d8-account-create-update-wnrsx" event={"ID":"1719978b-bac0-4de9-a1d0-0699cc81c53b","Type":"ContainerDied","Data":"ccb9f9aaf311645e58e52d2cdb654cf0177eb185804f2a0b51390b71db11738a"} Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.174506 4892 generic.go:334] "Generic (PLEG): container finished" podID="776ec54b-43e8-4316-bbca-e54f0302c366" containerID="2124fc3cd68ef6e01d0bda780f105f3113271b407b7ed59edf969e538b690838" exitCode=0 Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.174573 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cpcdn" event={"ID":"776ec54b-43e8-4316-bbca-e54f0302c366","Type":"ContainerDied","Data":"2124fc3cd68ef6e01d0bda780f105f3113271b407b7ed59edf969e538b690838"} Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.703290 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7d7bl"] Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.706586 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.725030 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d7bl"] Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.853728 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-utilities\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.854008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47tph\" (UniqueName: \"kubernetes.io/projected/374d103b-c8ff-4cac-aa32-5d7355ac5746-kube-api-access-47tph\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.854084 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-catalog-content\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.959695 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47tph\" (UniqueName: \"kubernetes.io/projected/374d103b-c8ff-4cac-aa32-5d7355ac5746-kube-api-access-47tph\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.959802 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-catalog-content\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.960039 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-utilities\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.960604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-catalog-content\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.960627 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-utilities\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:02 crc kubenswrapper[4892]: I0217 19:17:02.988893 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47tph\" (UniqueName: \"kubernetes.io/projected/374d103b-c8ff-4cac-aa32-5d7355ac5746-kube-api-access-47tph\") pod \"redhat-marketplace-7d7bl\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.031264 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.485861 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d7bl"] Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.658234 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.665116 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.776273 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpfmw\" (UniqueName: \"kubernetes.io/projected/776ec54b-43e8-4316-bbca-e54f0302c366-kube-api-access-wpfmw\") pod \"776ec54b-43e8-4316-bbca-e54f0302c366\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.776347 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1719978b-bac0-4de9-a1d0-0699cc81c53b-operator-scripts\") pod \"1719978b-bac0-4de9-a1d0-0699cc81c53b\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.776523 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776ec54b-43e8-4316-bbca-e54f0302c366-operator-scripts\") pod \"776ec54b-43e8-4316-bbca-e54f0302c366\" (UID: \"776ec54b-43e8-4316-bbca-e54f0302c366\") " Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.776606 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd4vd\" (UniqueName: \"kubernetes.io/projected/1719978b-bac0-4de9-a1d0-0699cc81c53b-kube-api-access-kd4vd\") pod 
\"1719978b-bac0-4de9-a1d0-0699cc81c53b\" (UID: \"1719978b-bac0-4de9-a1d0-0699cc81c53b\") " Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.776931 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1719978b-bac0-4de9-a1d0-0699cc81c53b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1719978b-bac0-4de9-a1d0-0699cc81c53b" (UID: "1719978b-bac0-4de9-a1d0-0699cc81c53b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.777062 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776ec54b-43e8-4316-bbca-e54f0302c366-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "776ec54b-43e8-4316-bbca-e54f0302c366" (UID: "776ec54b-43e8-4316-bbca-e54f0302c366"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.780775 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776ec54b-43e8-4316-bbca-e54f0302c366-kube-api-access-wpfmw" (OuterVolumeSpecName: "kube-api-access-wpfmw") pod "776ec54b-43e8-4316-bbca-e54f0302c366" (UID: "776ec54b-43e8-4316-bbca-e54f0302c366"). InnerVolumeSpecName "kube-api-access-wpfmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.782975 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1719978b-bac0-4de9-a1d0-0699cc81c53b-kube-api-access-kd4vd" (OuterVolumeSpecName: "kube-api-access-kd4vd") pod "1719978b-bac0-4de9-a1d0-0699cc81c53b" (UID: "1719978b-bac0-4de9-a1d0-0699cc81c53b"). InnerVolumeSpecName "kube-api-access-kd4vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.878972 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1719978b-bac0-4de9-a1d0-0699cc81c53b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.879021 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776ec54b-43e8-4316-bbca-e54f0302c366-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.879043 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd4vd\" (UniqueName: \"kubernetes.io/projected/1719978b-bac0-4de9-a1d0-0699cc81c53b-kube-api-access-kd4vd\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:03 crc kubenswrapper[4892]: I0217 19:17:03.879063 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpfmw\" (UniqueName: \"kubernetes.io/projected/776ec54b-43e8-4316-bbca-e54f0302c366-kube-api-access-wpfmw\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.212942 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f3d8-account-create-update-wnrsx" event={"ID":"1719978b-bac0-4de9-a1d0-0699cc81c53b","Type":"ContainerDied","Data":"7ebbb5cff129dbbf1de1f0b605308d9ccb26bfc91525d6c8a1ce5ff20edc1494"} Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.212982 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ebbb5cff129dbbf1de1f0b605308d9ccb26bfc91525d6c8a1ce5ff20edc1494" Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.213002 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f3d8-account-create-update-wnrsx" Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.217728 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cpcdn" event={"ID":"776ec54b-43e8-4316-bbca-e54f0302c366","Type":"ContainerDied","Data":"17454c7be5af98b3a1b72922626dc3d5e1f2dc1cde45eea063a6cce0fa2f4e3f"} Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.217762 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17454c7be5af98b3a1b72922626dc3d5e1f2dc1cde45eea063a6cce0fa2f4e3f" Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.217842 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cpcdn" Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.220641 4892 generic.go:334] "Generic (PLEG): container finished" podID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerID="fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd" exitCode=0 Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.220686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d7bl" event={"ID":"374d103b-c8ff-4cac-aa32-5d7355ac5746","Type":"ContainerDied","Data":"fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd"} Feb 17 19:17:04 crc kubenswrapper[4892]: I0217 19:17:04.220714 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d7bl" event={"ID":"374d103b-c8ff-4cac-aa32-5d7355ac5746","Type":"ContainerStarted","Data":"e83190f92744231179b0dd55a571016413932724027a52b407f2fe3d2dfc24dc"} Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.064230 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.064600 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.232674 4892 generic.go:334] "Generic (PLEG): container finished" podID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerID="0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e" exitCode=0 Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.232736 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d7bl" event={"ID":"374d103b-c8ff-4cac-aa32-5d7355ac5746","Type":"ContainerDied","Data":"0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e"} Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.416645 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-c7scg"] Feb 17 19:17:05 crc kubenswrapper[4892]: E0217 19:17:05.417192 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1719978b-bac0-4de9-a1d0-0699cc81c53b" containerName="mariadb-account-create-update" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.417210 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1719978b-bac0-4de9-a1d0-0699cc81c53b" containerName="mariadb-account-create-update" Feb 17 19:17:05 crc kubenswrapper[4892]: E0217 19:17:05.417223 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776ec54b-43e8-4316-bbca-e54f0302c366" containerName="mariadb-database-create" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.417232 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="776ec54b-43e8-4316-bbca-e54f0302c366" containerName="mariadb-database-create" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.417440 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="776ec54b-43e8-4316-bbca-e54f0302c366" containerName="mariadb-database-create" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.417462 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1719978b-bac0-4de9-a1d0-0699cc81c53b" containerName="mariadb-account-create-update" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.418100 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.420159 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zvxqz" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.420659 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.430882 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-c7scg"] Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.616985 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-config-data\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.617798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv48z\" (UniqueName: \"kubernetes.io/projected/ff3816c1-c175-4094-89c4-e2143aa24c5c-kube-api-access-zv48z\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.617979 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-db-sync-config-data\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 
19:17:05.618154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-combined-ca-bundle\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.720091 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-config-data\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.720212 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv48z\" (UniqueName: \"kubernetes.io/projected/ff3816c1-c175-4094-89c4-e2143aa24c5c-kube-api-access-zv48z\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.720248 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-db-sync-config-data\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.720305 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-combined-ca-bundle\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.725923 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-db-sync-config-data\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.725987 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-combined-ca-bundle\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.726446 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-config-data\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:05 crc kubenswrapper[4892]: I0217 19:17:05.743042 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv48z\" (UniqueName: \"kubernetes.io/projected/ff3816c1-c175-4094-89c4-e2143aa24c5c-kube-api-access-zv48z\") pod \"glance-db-sync-c7scg\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:06 crc kubenswrapper[4892]: I0217 19:17:06.038091 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:06 crc kubenswrapper[4892]: I0217 19:17:06.157009 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wttf5" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="registry-server" probeResult="failure" output=< Feb 17 19:17:06 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:17:06 crc kubenswrapper[4892]: > Feb 17 19:17:06 crc kubenswrapper[4892]: I0217 19:17:06.265162 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d7bl" event={"ID":"374d103b-c8ff-4cac-aa32-5d7355ac5746","Type":"ContainerStarted","Data":"1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce"} Feb 17 19:17:06 crc kubenswrapper[4892]: I0217 19:17:06.297368 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7d7bl" podStartSLOduration=2.914624886 podStartE2EDuration="4.297350024s" podCreationTimestamp="2026-02-17 19:17:02 +0000 UTC" firstStartedPulling="2026-02-17 19:17:04.223980935 +0000 UTC m=+5595.599384210" lastFinishedPulling="2026-02-17 19:17:05.606706073 +0000 UTC m=+5596.982109348" observedRunningTime="2026-02-17 19:17:06.288261809 +0000 UTC m=+5597.663665084" watchObservedRunningTime="2026-02-17 19:17:06.297350024 +0000 UTC m=+5597.672753289" Feb 17 19:17:06 crc kubenswrapper[4892]: I0217 19:17:06.605023 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-c7scg"] Feb 17 19:17:07 crc kubenswrapper[4892]: I0217 19:17:07.276839 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c7scg" event={"ID":"ff3816c1-c175-4094-89c4-e2143aa24c5c","Type":"ContainerStarted","Data":"899255e01f33471ea060742028bbabc149c65b6233a959a55119a355e123632b"} Feb 17 19:17:07 crc kubenswrapper[4892]: I0217 19:17:07.276883 4892 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-sync-c7scg" event={"ID":"ff3816c1-c175-4094-89c4-e2143aa24c5c","Type":"ContainerStarted","Data":"6795d5fc9ec08818858cdc3b48a5346bebeebd740244ac5ea4801ad51f0b7b3d"} Feb 17 19:17:07 crc kubenswrapper[4892]: I0217 19:17:07.299849 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-c7scg" podStartSLOduration=2.299829506 podStartE2EDuration="2.299829506s" podCreationTimestamp="2026-02-17 19:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:07.295217161 +0000 UTC m=+5598.670620436" watchObservedRunningTime="2026-02-17 19:17:07.299829506 +0000 UTC m=+5598.675232771" Feb 17 19:17:11 crc kubenswrapper[4892]: I0217 19:17:11.326013 4892 generic.go:334] "Generic (PLEG): container finished" podID="ff3816c1-c175-4094-89c4-e2143aa24c5c" containerID="899255e01f33471ea060742028bbabc149c65b6233a959a55119a355e123632b" exitCode=0 Feb 17 19:17:11 crc kubenswrapper[4892]: I0217 19:17:11.326615 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c7scg" event={"ID":"ff3816c1-c175-4094-89c4-e2143aa24c5c","Type":"ContainerDied","Data":"899255e01f33471ea060742028bbabc149c65b6233a959a55119a355e123632b"} Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.817978 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.973489 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv48z\" (UniqueName: \"kubernetes.io/projected/ff3816c1-c175-4094-89c4-e2143aa24c5c-kube-api-access-zv48z\") pod \"ff3816c1-c175-4094-89c4-e2143aa24c5c\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.973683 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-combined-ca-bundle\") pod \"ff3816c1-c175-4094-89c4-e2143aa24c5c\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.973936 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-db-sync-config-data\") pod \"ff3816c1-c175-4094-89c4-e2143aa24c5c\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.974059 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-config-data\") pod \"ff3816c1-c175-4094-89c4-e2143aa24c5c\" (UID: \"ff3816c1-c175-4094-89c4-e2143aa24c5c\") " Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.979973 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3816c1-c175-4094-89c4-e2143aa24c5c-kube-api-access-zv48z" (OuterVolumeSpecName: "kube-api-access-zv48z") pod "ff3816c1-c175-4094-89c4-e2143aa24c5c" (UID: "ff3816c1-c175-4094-89c4-e2143aa24c5c"). InnerVolumeSpecName "kube-api-access-zv48z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:12 crc kubenswrapper[4892]: I0217 19:17:12.982082 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ff3816c1-c175-4094-89c4-e2143aa24c5c" (UID: "ff3816c1-c175-4094-89c4-e2143aa24c5c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.029060 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff3816c1-c175-4094-89c4-e2143aa24c5c" (UID: "ff3816c1-c175-4094-89c4-e2143aa24c5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.034538 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.034774 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.078005 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv48z\" (UniqueName: \"kubernetes.io/projected/ff3816c1-c175-4094-89c4-e2143aa24c5c-kube-api-access-zv48z\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.078043 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.078065 4892 reconciler_common.go:293] "Volume detached 
for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.084998 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-config-data" (OuterVolumeSpecName: "config-data") pod "ff3816c1-c175-4094-89c4-e2143aa24c5c" (UID: "ff3816c1-c175-4094-89c4-e2143aa24c5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.108927 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.179553 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3816c1-c175-4094-89c4-e2143aa24c5c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.351696 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c7scg" event={"ID":"ff3816c1-c175-4094-89c4-e2143aa24c5c","Type":"ContainerDied","Data":"6795d5fc9ec08818858cdc3b48a5346bebeebd740244ac5ea4801ad51f0b7b3d"} Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.351744 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-c7scg" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.351761 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6795d5fc9ec08818858cdc3b48a5346bebeebd740244ac5ea4801ad51f0b7b3d" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.428731 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.488413 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d7bl"] Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.753293 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cc68945df-zppb2"] Feb 17 19:17:13 crc kubenswrapper[4892]: E0217 19:17:13.753945 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3816c1-c175-4094-89c4-e2143aa24c5c" containerName="glance-db-sync" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.753969 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3816c1-c175-4094-89c4-e2143aa24c5c" containerName="glance-db-sync" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.754227 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3816c1-c175-4094-89c4-e2143aa24c5c" containerName="glance-db-sync" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.755557 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.767741 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.769903 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.773166 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.773503 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.773694 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zvxqz" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.773866 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.784024 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.838576 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cc68945df-zppb2"] Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.889944 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.892340 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.898691 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.899204 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921695 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-dns-svc\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921740 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921789 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-config\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcnc\" (UniqueName: \"kubernetes.io/projected/a8b44870-db5a-4228-bc5f-6e85be75fe36-kube-api-access-2kcnc\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " 
pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921873 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-scripts\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921904 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921929 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klksm\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-kube-api-access-klksm\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921949 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-ceph\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.921978 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-config-data\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.922017 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-nb\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:13 crc kubenswrapper[4892]: I0217 19:17:13.922036 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-logs\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-nb\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-logs\") pod \"glance-default-external-api-0\" (UID: 
\"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-dns-svc\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024192 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-config-data\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024210 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024259 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-config\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcnc\" (UniqueName: \"kubernetes.io/projected/a8b44870-db5a-4228-bc5f-6e85be75fe36-kube-api-access-2kcnc\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " 
pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024317 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-ceph\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024334 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024355 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-scripts\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024380 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-logs\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " 
pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klksm\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-kube-api-access-klksm\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024448 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-ceph\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024464 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024480 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-config-data\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024499 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-scripts\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 
19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024516 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9k5\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-kube-api-access-5d9k5\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024536 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024652 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-logs\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.024720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.025203 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-dns-svc\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.025277 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-nb\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.025398 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.026066 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-config\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.029977 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-scripts\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.030041 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-ceph\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.031270 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-config-data\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.032446 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.038755 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klksm\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-kube-api-access-klksm\") pod \"glance-default-external-api-0\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.043376 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcnc\" (UniqueName: \"kubernetes.io/projected/a8b44870-db5a-4228-bc5f-6e85be75fe36-kube-api-access-2kcnc\") pod \"dnsmasq-dns-7cc68945df-zppb2\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.082196 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.118614 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.127957 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-ceph\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.128016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.128075 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-logs\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.128148 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-scripts\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.128181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9k5\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-kube-api-access-5d9k5\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 
19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.128214 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.128323 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-config-data\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.129668 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-logs\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.130289 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.133037 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.134447 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-config-data\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.135033 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-scripts\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.141944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-ceph\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.160702 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9k5\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-kube-api-access-5d9k5\") pod \"glance-default-internal-api-0\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:14 crc kubenswrapper[4892]: I0217 19:17:14.235438 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:14.613875 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cc68945df-zppb2"] Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:14.670171 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:14.807651 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:15 crc kubenswrapper[4892]: W0217 19:17:14.810579 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5584e2ce_2c3e_4b8b_9ad5_3780dd796ab7.slice/crio-c0251130a09563bbe81d34127687181f0869c65ad9d2ad0de7cf092c35cb2e6e WatchSource:0}: Error finding container c0251130a09563bbe81d34127687181f0869c65ad9d2ad0de7cf092c35cb2e6e: Status 404 returned error can't find the container with id c0251130a09563bbe81d34127687181f0869c65ad9d2ad0de7cf092c35cb2e6e Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:14.927891 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:15 crc kubenswrapper[4892]: W0217 19:17:15.003213 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76aefb55_60ee_49af_803c_c0237d2fb375.slice/crio-11d91f94bd422d7dcc568477be822e12e0a9c1b9014090c94b87d78dc9295eaf WatchSource:0}: Error finding container 11d91f94bd422d7dcc568477be822e12e0a9c1b9014090c94b87d78dc9295eaf: Status 404 returned error can't find the container with id 11d91f94bd422d7dcc568477be822e12e0a9c1b9014090c94b87d78dc9295eaf Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.163755 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wttf5" 
Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.296006 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.379660 4892 generic.go:334] "Generic (PLEG): container finished" podID="a8b44870-db5a-4228-bc5f-6e85be75fe36" containerID="9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152" exitCode=0 Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.379891 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7d7bl" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="registry-server" containerID="cri-o://1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce" gracePeriod=2 Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.388959 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"76aefb55-60ee-49af-803c-c0237d2fb375","Type":"ContainerStarted","Data":"11d91f94bd422d7dcc568477be822e12e0a9c1b9014090c94b87d78dc9295eaf"} Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.389002 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7","Type":"ContainerStarted","Data":"c0251130a09563bbe81d34127687181f0869c65ad9d2ad0de7cf092c35cb2e6e"} Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.389017 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" event={"ID":"a8b44870-db5a-4228-bc5f-6e85be75fe36","Type":"ContainerDied","Data":"9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152"} Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.389032 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" 
event={"ID":"a8b44870-db5a-4228-bc5f-6e85be75fe36","Type":"ContainerStarted","Data":"dbe16d04c27173bacf838b2865831c8f30adba155715e7e122d5e7f4074e7984"} Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.748184 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wttf5"] Feb 17 19:17:15 crc kubenswrapper[4892]: I0217 19:17:15.924402 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.070235 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-utilities\") pod \"374d103b-c8ff-4cac-aa32-5d7355ac5746\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.070306 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-catalog-content\") pod \"374d103b-c8ff-4cac-aa32-5d7355ac5746\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.070646 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47tph\" (UniqueName: \"kubernetes.io/projected/374d103b-c8ff-4cac-aa32-5d7355ac5746-kube-api-access-47tph\") pod \"374d103b-c8ff-4cac-aa32-5d7355ac5746\" (UID: \"374d103b-c8ff-4cac-aa32-5d7355ac5746\") " Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.077466 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374d103b-c8ff-4cac-aa32-5d7355ac5746-kube-api-access-47tph" (OuterVolumeSpecName: "kube-api-access-47tph") pod "374d103b-c8ff-4cac-aa32-5d7355ac5746" (UID: "374d103b-c8ff-4cac-aa32-5d7355ac5746"). InnerVolumeSpecName "kube-api-access-47tph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.071778 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-utilities" (OuterVolumeSpecName: "utilities") pod "374d103b-c8ff-4cac-aa32-5d7355ac5746" (UID: "374d103b-c8ff-4cac-aa32-5d7355ac5746"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.093685 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "374d103b-c8ff-4cac-aa32-5d7355ac5746" (UID: "374d103b-c8ff-4cac-aa32-5d7355ac5746"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.177179 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47tph\" (UniqueName: \"kubernetes.io/projected/374d103b-c8ff-4cac-aa32-5d7355ac5746-kube-api-access-47tph\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.177459 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.177468 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374d103b-c8ff-4cac-aa32-5d7355ac5746-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.401350 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" 
event={"ID":"a8b44870-db5a-4228-bc5f-6e85be75fe36","Type":"ContainerStarted","Data":"58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170"} Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.401736 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.404742 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"76aefb55-60ee-49af-803c-c0237d2fb375","Type":"ContainerStarted","Data":"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c"} Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.407071 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7","Type":"ContainerStarted","Data":"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08"} Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.410071 4892 generic.go:334] "Generic (PLEG): container finished" podID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerID="1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce" exitCode=0 Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.410311 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wttf5" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="registry-server" containerID="cri-o://c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766" gracePeriod=2 Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.410678 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d7bl" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.411154 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d7bl" event={"ID":"374d103b-c8ff-4cac-aa32-5d7355ac5746","Type":"ContainerDied","Data":"1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce"} Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.411192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d7bl" event={"ID":"374d103b-c8ff-4cac-aa32-5d7355ac5746","Type":"ContainerDied","Data":"e83190f92744231179b0dd55a571016413932724027a52b407f2fe3d2dfc24dc"} Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.411214 4892 scope.go:117] "RemoveContainer" containerID="1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.436763 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" podStartSLOduration=3.436743782 podStartE2EDuration="3.436743782s" podCreationTimestamp="2026-02-17 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:16.434378288 +0000 UTC m=+5607.809781553" watchObservedRunningTime="2026-02-17 19:17:16.436743782 +0000 UTC m=+5607.812147047" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.448556 4892 scope.go:117] "RemoveContainer" containerID="0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.461974 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d7bl"] Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.477236 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d7bl"] Feb 17 
19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.493447 4892 scope.go:117] "RemoveContainer" containerID="fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.577754 4892 scope.go:117] "RemoveContainer" containerID="1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce" Feb 17 19:17:16 crc kubenswrapper[4892]: E0217 19:17:16.578651 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce\": container with ID starting with 1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce not found: ID does not exist" containerID="1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.578689 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce"} err="failed to get container status \"1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce\": rpc error: code = NotFound desc = could not find container \"1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce\": container with ID starting with 1020fc3eea8afec379fdd2ce77ad86a3f90817c12e3b4fd1c8e07d10b2e61cce not found: ID does not exist" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.578713 4892 scope.go:117] "RemoveContainer" containerID="0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e" Feb 17 19:17:16 crc kubenswrapper[4892]: E0217 19:17:16.579043 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e\": container with ID starting with 0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e not found: ID does not exist" 
containerID="0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.579075 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e"} err="failed to get container status \"0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e\": rpc error: code = NotFound desc = could not find container \"0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e\": container with ID starting with 0f087f8f937cf9a4ee1bc7f52e4f75446e64a4e1776cc273ab1b14fb43e6b12e not found: ID does not exist" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.579095 4892 scope.go:117] "RemoveContainer" containerID="fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd" Feb 17 19:17:16 crc kubenswrapper[4892]: E0217 19:17:16.579282 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd\": container with ID starting with fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd not found: ID does not exist" containerID="fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.579311 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd"} err="failed to get container status \"fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd\": rpc error: code = NotFound desc = could not find container \"fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd\": container with ID starting with fc46e314caa4a47393053e9e55a6c248b59a7a4a5baf2fce40f66ba12cd802fd not found: ID does not exist" Feb 17 19:17:16 crc kubenswrapper[4892]: I0217 19:17:16.939510 4892 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.076438 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.092424 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-catalog-content\") pod \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.092521 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr5hs\" (UniqueName: \"kubernetes.io/projected/2f283cf0-c55a-469e-b46e-d5eb187f33e9-kube-api-access-mr5hs\") pod \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.092591 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-utilities\") pod \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\" (UID: \"2f283cf0-c55a-469e-b46e-d5eb187f33e9\") " Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.093640 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-utilities" (OuterVolumeSpecName: "utilities") pod "2f283cf0-c55a-469e-b46e-d5eb187f33e9" (UID: "2f283cf0-c55a-469e-b46e-d5eb187f33e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.117148 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f283cf0-c55a-469e-b46e-d5eb187f33e9-kube-api-access-mr5hs" (OuterVolumeSpecName: "kube-api-access-mr5hs") pod "2f283cf0-c55a-469e-b46e-d5eb187f33e9" (UID: "2f283cf0-c55a-469e-b46e-d5eb187f33e9"). InnerVolumeSpecName "kube-api-access-mr5hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.195074 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr5hs\" (UniqueName: \"kubernetes.io/projected/2f283cf0-c55a-469e-b46e-d5eb187f33e9-kube-api-access-mr5hs\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.195104 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.213599 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f283cf0-c55a-469e-b46e-d5eb187f33e9" (UID: "2f283cf0-c55a-469e-b46e-d5eb187f33e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.296748 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f283cf0-c55a-469e-b46e-d5eb187f33e9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.372710 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" path="/var/lib/kubelet/pods/374d103b-c8ff-4cac-aa32-5d7355ac5746/volumes" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.420603 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"76aefb55-60ee-49af-803c-c0237d2fb375","Type":"ContainerStarted","Data":"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6"} Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.423279 4892 generic.go:334] "Generic (PLEG): container finished" podID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerID="c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766" exitCode=0 Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.423323 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wttf5" event={"ID":"2f283cf0-c55a-469e-b46e-d5eb187f33e9","Type":"ContainerDied","Data":"c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766"} Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.423343 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wttf5" event={"ID":"2f283cf0-c55a-469e-b46e-d5eb187f33e9","Type":"ContainerDied","Data":"544386a9887d3b3701f04b4f927785ef338c09b5965a92e90931e36620118216"} Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.423359 4892 scope.go:117] "RemoveContainer" containerID="c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766" Feb 17 19:17:17 crc 
kubenswrapper[4892]: I0217 19:17:17.423463 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wttf5" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.427757 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7","Type":"ContainerStarted","Data":"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e"} Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.427847 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-httpd" containerID="cri-o://ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e" gracePeriod=30 Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.427796 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-log" containerID="cri-o://9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08" gracePeriod=30 Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.446099 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.44607749 podStartE2EDuration="4.44607749s" podCreationTimestamp="2026-02-17 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:17.444192648 +0000 UTC m=+5608.819595933" watchObservedRunningTime="2026-02-17 19:17:17.44607749 +0000 UTC m=+5608.821480755" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.454930 4892 scope.go:117] "RemoveContainer" containerID="184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6" Feb 17 19:17:17 crc kubenswrapper[4892]: 
I0217 19:17:17.474847 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.474827035 podStartE2EDuration="4.474827035s" podCreationTimestamp="2026-02-17 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:17.46315186 +0000 UTC m=+5608.838555125" watchObservedRunningTime="2026-02-17 19:17:17.474827035 +0000 UTC m=+5608.850230300" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.491139 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wttf5"] Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.494692 4892 scope.go:117] "RemoveContainer" containerID="1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.500394 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wttf5"] Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.518626 4892 scope.go:117] "RemoveContainer" containerID="c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766" Feb 17 19:17:17 crc kubenswrapper[4892]: E0217 19:17:17.519259 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766\": container with ID starting with c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766 not found: ID does not exist" containerID="c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.519302 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766"} err="failed to get container status 
\"c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766\": rpc error: code = NotFound desc = could not find container \"c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766\": container with ID starting with c7f545a70d78f9840d1fedd4abc556972cb3cafbee999a7e31cddc049fd6c766 not found: ID does not exist" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.519329 4892 scope.go:117] "RemoveContainer" containerID="184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6" Feb 17 19:17:17 crc kubenswrapper[4892]: E0217 19:17:17.519754 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6\": container with ID starting with 184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6 not found: ID does not exist" containerID="184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.519784 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6"} err="failed to get container status \"184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6\": rpc error: code = NotFound desc = could not find container \"184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6\": container with ID starting with 184b25e2924d41726eb04584e7e25c2345a3f218d1f0959c735f5a87ae7c40e6 not found: ID does not exist" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.519807 4892 scope.go:117] "RemoveContainer" containerID="1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b" Feb 17 19:17:17 crc kubenswrapper[4892]: E0217 19:17:17.520101 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b\": container with ID starting with 1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b not found: ID does not exist" containerID="1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b" Feb 17 19:17:17 crc kubenswrapper[4892]: I0217 19:17:17.520146 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b"} err="failed to get container status \"1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b\": rpc error: code = NotFound desc = could not find container \"1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b\": container with ID starting with 1a81f16c2d743f107cd3ff16ba1ab70710f359d3ee38cac5c64dbdaa3481306b not found: ID does not exist" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.122500 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213304 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-combined-ca-bundle\") pod \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213412 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-logs\") pod \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213488 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-scripts\") pod 
\"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213532 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-ceph\") pod \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213597 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-httpd-run\") pod \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213734 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-config-data\") pod \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.213834 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klksm\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-kube-api-access-klksm\") pod \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\" (UID: \"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7\") " Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.214018 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-logs" (OuterVolumeSpecName: "logs") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.214316 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.214651 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.214676 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.218336 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-kube-api-access-klksm" (OuterVolumeSpecName: "kube-api-access-klksm") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "kube-api-access-klksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.218655 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-ceph" (OuterVolumeSpecName: "ceph") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.220212 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-scripts" (OuterVolumeSpecName: "scripts") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.261596 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.278809 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-config-data" (OuterVolumeSpecName: "config-data") pod "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" (UID: "5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.317860 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.317915 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.317934 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.317955 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klksm\" (UniqueName: \"kubernetes.io/projected/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-kube-api-access-klksm\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.317976 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450328 4892 generic.go:334] "Generic (PLEG): container finished" podID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerID="ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e" exitCode=0 Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450371 4892 generic.go:334] "Generic (PLEG): container finished" podID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerID="9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08" exitCode=143 Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450404 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7","Type":"ContainerDied","Data":"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e"} Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7","Type":"ContainerDied","Data":"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08"} Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450465 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7","Type":"ContainerDied","Data":"c0251130a09563bbe81d34127687181f0869c65ad9d2ad0de7cf092c35cb2e6e"} Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450483 4892 scope.go:117] "RemoveContainer" containerID="ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450828 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-log" containerID="cri-o://e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c" gracePeriod=30 Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.450884 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-httpd" containerID="cri-o://d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6" gracePeriod=30 Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.451343 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.490112 4892 scope.go:117] "RemoveContainer" containerID="9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.512513 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.537261 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.551853 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552329 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="extract-content" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552347 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="extract-content" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552363 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="extract-utilities" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552372 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="extract-utilities" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552383 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="registry-server" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552389 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="registry-server" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552398 
4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="extract-content" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552404 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="extract-content" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552418 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="extract-utilities" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552424 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="extract-utilities" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="registry-server" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552439 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="registry-server" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552451 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-log" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552456 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-log" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.552475 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-httpd" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552481 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-httpd" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552686 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="374d103b-c8ff-4cac-aa32-5d7355ac5746" containerName="registry-server" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552699 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-log" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552710 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" containerName="registry-server" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.552718 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" containerName="glance-httpd" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.553754 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.559128 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.559267 4892 scope.go:117] "RemoveContainer" containerID="ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e" Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.563148 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e\": container with ID starting with ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e not found: ID does not exist" containerID="ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.563192 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e"} err="failed to get 
container status \"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e\": rpc error: code = NotFound desc = could not find container \"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e\": container with ID starting with ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e not found: ID does not exist" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.563214 4892 scope.go:117] "RemoveContainer" containerID="9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.573284 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:18 crc kubenswrapper[4892]: E0217 19:17:18.578652 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08\": container with ID starting with 9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08 not found: ID does not exist" containerID="9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.578689 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08"} err="failed to get container status \"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08\": rpc error: code = NotFound desc = could not find container \"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08\": container with ID starting with 9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08 not found: ID does not exist" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.578716 4892 scope.go:117] "RemoveContainer" containerID="ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.580377 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e"} err="failed to get container status \"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e\": rpc error: code = NotFound desc = could not find container \"ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e\": container with ID starting with ab189c3313490197590ad4bee2bab8c004be07b6204413277401644ae1ff198e not found: ID does not exist" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.580435 4892 scope.go:117] "RemoveContainer" containerID="9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.580848 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08"} err="failed to get container status \"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08\": rpc error: code = NotFound desc = could not find container \"9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08\": container with ID starting with 9a010163d8f76589e57ddaa0e5b4393f270ce1865b05b885f0ee6eaff29f1a08 not found: ID does not exist" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624283 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624343 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624371 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-scripts\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624405 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-ceph\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-logs\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624540 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78jn\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-kube-api-access-p78jn\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.624563 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726388 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-config-data\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726436 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-scripts\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726481 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-ceph\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-logs\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726587 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78jn\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-kube-api-access-p78jn\") pod 
\"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726611 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.726646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.727066 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.727237 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-logs\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.730685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-ceph\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " 
pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.730703 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-scripts\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.731581 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-config-data\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.737469 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.745444 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78jn\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-kube-api-access-p78jn\") pod \"glance-default-external-api-0\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " pod="openstack/glance-default-external-api-0" Feb 17 19:17:18 crc kubenswrapper[4892]: I0217 19:17:18.881302 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.122297 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240395 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-logs\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240512 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-scripts\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240536 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-config-data\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240560 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d9k5\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-kube-api-access-5d9k5\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240577 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-ceph\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240715 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-combined-ca-bundle\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240739 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-httpd-run\") pod \"76aefb55-60ee-49af-803c-c0237d2fb375\" (UID: \"76aefb55-60ee-49af-803c-c0237d2fb375\") " Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.240881 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-logs" (OuterVolumeSpecName: "logs") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.241081 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.241354 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.241369 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76aefb55-60ee-49af-803c-c0237d2fb375-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.249925 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-kube-api-access-5d9k5" (OuterVolumeSpecName: "kube-api-access-5d9k5") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "kube-api-access-5d9k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.249943 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-scripts" (OuterVolumeSpecName: "scripts") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.250013 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-ceph" (OuterVolumeSpecName: "ceph") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.266755 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.296797 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-config-data" (OuterVolumeSpecName: "config-data") pod "76aefb55-60ee-49af-803c-c0237d2fb375" (UID: "76aefb55-60ee-49af-803c-c0237d2fb375"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.343738 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.343776 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.343790 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d9k5\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-kube-api-access-5d9k5\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.343802 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/76aefb55-60ee-49af-803c-c0237d2fb375-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc 
kubenswrapper[4892]: I0217 19:17:19.343827 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aefb55-60ee-49af-803c-c0237d2fb375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.372559 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f283cf0-c55a-469e-b46e-d5eb187f33e9" path="/var/lib/kubelet/pods/2f283cf0-c55a-469e-b46e-d5eb187f33e9/volumes" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.373463 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7" path="/var/lib/kubelet/pods/5584e2ce-2c3e-4b8b-9ad5-3780dd796ab7/volumes" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.467172 4892 generic.go:334] "Generic (PLEG): container finished" podID="76aefb55-60ee-49af-803c-c0237d2fb375" containerID="d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6" exitCode=0 Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.467917 4892 generic.go:334] "Generic (PLEG): container finished" podID="76aefb55-60ee-49af-803c-c0237d2fb375" containerID="e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c" exitCode=143 Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.467229 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"76aefb55-60ee-49af-803c-c0237d2fb375","Type":"ContainerDied","Data":"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6"} Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.467286 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.468026 4892 scope.go:117] "RemoveContainer" containerID="d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.468001 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"76aefb55-60ee-49af-803c-c0237d2fb375","Type":"ContainerDied","Data":"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c"} Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.468461 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"76aefb55-60ee-49af-803c-c0237d2fb375","Type":"ContainerDied","Data":"11d91f94bd422d7dcc568477be822e12e0a9c1b9014090c94b87d78dc9295eaf"} Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.509909 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.521612 4892 scope.go:117] "RemoveContainer" containerID="e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.533972 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.561895 4892 scope.go:117] "RemoveContainer" containerID="d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6" Feb 17 19:17:19 crc kubenswrapper[4892]: E0217 19:17:19.562520 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6\": container with ID starting with d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6 not found: ID does not exist" 
containerID="d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.562564 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6"} err="failed to get container status \"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6\": rpc error: code = NotFound desc = could not find container \"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6\": container with ID starting with d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6 not found: ID does not exist" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.562593 4892 scope.go:117] "RemoveContainer" containerID="e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c" Feb 17 19:17:19 crc kubenswrapper[4892]: E0217 19:17:19.563051 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c\": container with ID starting with e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c not found: ID does not exist" containerID="e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.563105 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c"} err="failed to get container status \"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c\": rpc error: code = NotFound desc = could not find container \"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c\": container with ID starting with e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c not found: ID does not exist" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.563134 4892 scope.go:117] 
"RemoveContainer" containerID="d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.563498 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6"} err="failed to get container status \"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6\": rpc error: code = NotFound desc = could not find container \"d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6\": container with ID starting with d31513f0b7c997f001bc79823c8edad201d62c1c35b420c049f1e6e854751ee6 not found: ID does not exist" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.563530 4892 scope.go:117] "RemoveContainer" containerID="e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.565107 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.565393 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c"} err="failed to get container status \"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c\": rpc error: code = NotFound desc = could not find container \"e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c\": container with ID starting with e326de689b3521834ed5f776bfc13866794d9aeb6e7a48ec90f35e02d8a7507c not found: ID does not exist" Feb 17 19:17:19 crc kubenswrapper[4892]: E0217 19:17:19.565765 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-log" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.565796 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" 
containerName="glance-log" Feb 17 19:17:19 crc kubenswrapper[4892]: E0217 19:17:19.565850 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-httpd" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.565862 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-httpd" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.577170 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-httpd" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.577271 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" containerName="glance-log" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.587156 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.590624 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.594943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.658232 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660174 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660308 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660409 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgb8\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-kube-api-access-clgb8\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660429 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660475 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660510 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.660566 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: W0217 19:17:19.662028 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc2d1c9d_6329_44ea_b176_b8734429b1da.slice/crio-837fccdb42b1c5b047a6a136afa8ffddf5e176b04333d4bd1a7544ddd3269f6a WatchSource:0}: Error finding container 837fccdb42b1c5b047a6a136afa8ffddf5e176b04333d4bd1a7544ddd3269f6a: Status 404 returned error can't find the container with id 837fccdb42b1c5b047a6a136afa8ffddf5e176b04333d4bd1a7544ddd3269f6a Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.762883 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.763058 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.763202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgb8\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-kube-api-access-clgb8\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.763231 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.763251 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.763566 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.763795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.764133 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.764263 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.767986 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.770563 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.770740 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.771685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.783149 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgb8\" (UniqueName: 
\"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-kube-api-access-clgb8\") pod \"glance-default-internal-api-0\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:17:19 crc kubenswrapper[4892]: I0217 19:17:19.916913 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:20 crc kubenswrapper[4892]: I0217 19:17:20.479440 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:17:20 crc kubenswrapper[4892]: I0217 19:17:20.491666 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc2d1c9d-6329-44ea-b176-b8734429b1da","Type":"ContainerStarted","Data":"3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def"} Feb 17 19:17:20 crc kubenswrapper[4892]: I0217 19:17:20.491710 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc2d1c9d-6329-44ea-b176-b8734429b1da","Type":"ContainerStarted","Data":"837fccdb42b1c5b047a6a136afa8ffddf5e176b04333d4bd1a7544ddd3269f6a"} Feb 17 19:17:20 crc kubenswrapper[4892]: W0217 19:17:20.495724 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd04280f1_af10_4187_a3e9_7f7384dafc7d.slice/crio-f7cbed2c4a0bb34b6d6c784484695056378dbd1136bf16ba41786bd6d9855178 WatchSource:0}: Error finding container f7cbed2c4a0bb34b6d6c784484695056378dbd1136bf16ba41786bd6d9855178: Status 404 returned error can't find the container with id f7cbed2c4a0bb34b6d6c784484695056378dbd1136bf16ba41786bd6d9855178 Feb 17 19:17:21 crc kubenswrapper[4892]: I0217 19:17:21.391018 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76aefb55-60ee-49af-803c-c0237d2fb375" path="/var/lib/kubelet/pods/76aefb55-60ee-49af-803c-c0237d2fb375/volumes" Feb 
17 19:17:21 crc kubenswrapper[4892]: I0217 19:17:21.504058 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc2d1c9d-6329-44ea-b176-b8734429b1da","Type":"ContainerStarted","Data":"6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73"} Feb 17 19:17:21 crc kubenswrapper[4892]: I0217 19:17:21.506894 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04280f1-af10-4187-a3e9-7f7384dafc7d","Type":"ContainerStarted","Data":"0d7cb5bc3f04fca0ac6d8c4eebfdaa5cfd90cc8171568f754b0730622a602e14"} Feb 17 19:17:21 crc kubenswrapper[4892]: I0217 19:17:21.506936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04280f1-af10-4187-a3e9-7f7384dafc7d","Type":"ContainerStarted","Data":"f7cbed2c4a0bb34b6d6c784484695056378dbd1136bf16ba41786bd6d9855178"} Feb 17 19:17:21 crc kubenswrapper[4892]: I0217 19:17:21.524975 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.524950787 podStartE2EDuration="3.524950787s" podCreationTimestamp="2026-02-17 19:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:21.520478686 +0000 UTC m=+5612.895881961" watchObservedRunningTime="2026-02-17 19:17:21.524950787 +0000 UTC m=+5612.900354072" Feb 17 19:17:22 crc kubenswrapper[4892]: I0217 19:17:22.525060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04280f1-af10-4187-a3e9-7f7384dafc7d","Type":"ContainerStarted","Data":"c71aded13f218c065f94e937f42ab143edd6b47a10b7ce47acb3faee6b4c612f"} Feb 17 19:17:22 crc kubenswrapper[4892]: I0217 19:17:22.561333 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.561304312 podStartE2EDuration="3.561304312s" podCreationTimestamp="2026-02-17 19:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:22.546945294 +0000 UTC m=+5613.922348579" watchObservedRunningTime="2026-02-17 19:17:22.561304312 +0000 UTC m=+5613.936707617" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.084891 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.184457 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5675647d9-6z5jv"] Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.184731 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerName="dnsmasq-dns" containerID="cri-o://b72e2b4573aa564e44d61945813dc88dcaec399a7156e9b402ce4ef3c68eaa93" gracePeriod=10 Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.562655 4892 generic.go:334] "Generic (PLEG): container finished" podID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerID="b72e2b4573aa564e44d61945813dc88dcaec399a7156e9b402ce4ef3c68eaa93" exitCode=0 Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.562744 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" event={"ID":"f21a68f6-4b3e-40a5-aa01-aee310e5aabb","Type":"ContainerDied","Data":"b72e2b4573aa564e44d61945813dc88dcaec399a7156e9b402ce4ef3c68eaa93"} Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.712865 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.756509 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-dns-svc\") pod \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.756619 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-nb\") pod \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.756728 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6fjs\" (UniqueName: \"kubernetes.io/projected/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-kube-api-access-j6fjs\") pod \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.756767 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-sb\") pod \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.756785 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-config\") pod \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\" (UID: \"f21a68f6-4b3e-40a5-aa01-aee310e5aabb\") " Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.762704 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-kube-api-access-j6fjs" (OuterVolumeSpecName: "kube-api-access-j6fjs") pod "f21a68f6-4b3e-40a5-aa01-aee310e5aabb" (UID: "f21a68f6-4b3e-40a5-aa01-aee310e5aabb"). InnerVolumeSpecName "kube-api-access-j6fjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.823248 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f21a68f6-4b3e-40a5-aa01-aee310e5aabb" (UID: "f21a68f6-4b3e-40a5-aa01-aee310e5aabb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.840792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f21a68f6-4b3e-40a5-aa01-aee310e5aabb" (UID: "f21a68f6-4b3e-40a5-aa01-aee310e5aabb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.844229 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f21a68f6-4b3e-40a5-aa01-aee310e5aabb" (UID: "f21a68f6-4b3e-40a5-aa01-aee310e5aabb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.846944 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-config" (OuterVolumeSpecName: "config") pod "f21a68f6-4b3e-40a5-aa01-aee310e5aabb" (UID: "f21a68f6-4b3e-40a5-aa01-aee310e5aabb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.859254 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.859283 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.859297 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6fjs\" (UniqueName: \"kubernetes.io/projected/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-kube-api-access-j6fjs\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.859306 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:24 crc kubenswrapper[4892]: I0217 19:17:24.859314 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f21a68f6-4b3e-40a5-aa01-aee310e5aabb-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:25 crc kubenswrapper[4892]: I0217 19:17:25.576369 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" event={"ID":"f21a68f6-4b3e-40a5-aa01-aee310e5aabb","Type":"ContainerDied","Data":"7b4dd1e9e2e50b702b93c7da5b33423f7d82df960a0886b343f09015c929263c"} Feb 17 19:17:25 crc kubenswrapper[4892]: I0217 19:17:25.576451 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5675647d9-6z5jv" Feb 17 19:17:25 crc kubenswrapper[4892]: I0217 19:17:25.576654 4892 scope.go:117] "RemoveContainer" containerID="b72e2b4573aa564e44d61945813dc88dcaec399a7156e9b402ce4ef3c68eaa93" Feb 17 19:17:25 crc kubenswrapper[4892]: I0217 19:17:25.609181 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5675647d9-6z5jv"] Feb 17 19:17:25 crc kubenswrapper[4892]: I0217 19:17:25.620145 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5675647d9-6z5jv"] Feb 17 19:17:25 crc kubenswrapper[4892]: I0217 19:17:25.629040 4892 scope.go:117] "RemoveContainer" containerID="26c491876dfa29d22fffb447dc10eab51c37a75f652de5af29f0e1bf12829650" Feb 17 19:17:27 crc kubenswrapper[4892]: I0217 19:17:27.376147 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" path="/var/lib/kubelet/pods/f21a68f6-4b3e-40a5-aa01-aee310e5aabb/volumes" Feb 17 19:17:28 crc kubenswrapper[4892]: I0217 19:17:28.881970 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 19:17:28 crc kubenswrapper[4892]: I0217 19:17:28.882036 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 19:17:28 crc kubenswrapper[4892]: I0217 19:17:28.920278 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 19:17:28 crc kubenswrapper[4892]: I0217 19:17:28.937174 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 19:17:29 crc kubenswrapper[4892]: I0217 19:17:29.643638 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 19:17:29 crc kubenswrapper[4892]: I0217 19:17:29.643915 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 19:17:29 crc kubenswrapper[4892]: I0217 19:17:29.917953 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:29 crc kubenswrapper[4892]: I0217 19:17:29.918040 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:29 crc kubenswrapper[4892]: I0217 19:17:29.959266 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:29 crc kubenswrapper[4892]: I0217 19:17:29.969018 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:30 crc kubenswrapper[4892]: I0217 19:17:30.656446 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:30 crc kubenswrapper[4892]: I0217 19:17:30.657062 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:31 crc kubenswrapper[4892]: I0217 19:17:31.665143 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 19:17:31 crc kubenswrapper[4892]: I0217 19:17:31.665451 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 19:17:31 crc kubenswrapper[4892]: I0217 19:17:31.730980 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 19:17:31 crc kubenswrapper[4892]: I0217 19:17:31.732973 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 19:17:32 crc kubenswrapper[4892]: I0217 19:17:32.592647 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Feb 17 19:17:32 crc kubenswrapper[4892]: I0217 19:17:32.611130 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.468569 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-64tqq"] Feb 17 19:17:40 crc kubenswrapper[4892]: E0217 19:17:40.469400 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerName="init" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.469413 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerName="init" Feb 17 19:17:40 crc kubenswrapper[4892]: E0217 19:17:40.469427 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerName="dnsmasq-dns" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.469433 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerName="dnsmasq-dns" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.469650 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21a68f6-4b3e-40a5-aa01-aee310e5aabb" containerName="dnsmasq-dns" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.470310 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.492423 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-64tqq"] Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.581303 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d452-account-create-update-ncvgx"] Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.585917 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ffea8d-6680-40df-8b32-312e03efb9aa-operator-scripts\") pod \"placement-db-create-64tqq\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.585988 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q44kg\" (UniqueName: \"kubernetes.io/projected/c2ffea8d-6680-40df-8b32-312e03efb9aa-kube-api-access-q44kg\") pod \"placement-db-create-64tqq\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.586205 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.593882 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d452-account-create-update-ncvgx"] Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.597714 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.687545 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q44kg\" (UniqueName: \"kubernetes.io/projected/c2ffea8d-6680-40df-8b32-312e03efb9aa-kube-api-access-q44kg\") pod \"placement-db-create-64tqq\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.687654 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26f9bb32-23ac-4897-a488-25db578b4696-operator-scripts\") pod \"placement-d452-account-create-update-ncvgx\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.687709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvhv\" (UniqueName: \"kubernetes.io/projected/26f9bb32-23ac-4897-a488-25db578b4696-kube-api-access-5gvhv\") pod \"placement-d452-account-create-update-ncvgx\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.687766 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ffea8d-6680-40df-8b32-312e03efb9aa-operator-scripts\") pod 
\"placement-db-create-64tqq\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.688589 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ffea8d-6680-40df-8b32-312e03efb9aa-operator-scripts\") pod \"placement-db-create-64tqq\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.714991 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q44kg\" (UniqueName: \"kubernetes.io/projected/c2ffea8d-6680-40df-8b32-312e03efb9aa-kube-api-access-q44kg\") pod \"placement-db-create-64tqq\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.790051 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26f9bb32-23ac-4897-a488-25db578b4696-operator-scripts\") pod \"placement-d452-account-create-update-ncvgx\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.790170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvhv\" (UniqueName: \"kubernetes.io/projected/26f9bb32-23ac-4897-a488-25db578b4696-kube-api-access-5gvhv\") pod \"placement-d452-account-create-update-ncvgx\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.790686 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26f9bb32-23ac-4897-a488-25db578b4696-operator-scripts\") pod 
\"placement-d452-account-create-update-ncvgx\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.806948 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gvhv\" (UniqueName: \"kubernetes.io/projected/26f9bb32-23ac-4897-a488-25db578b4696-kube-api-access-5gvhv\") pod \"placement-d452-account-create-update-ncvgx\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.840995 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-64tqq" Feb 17 19:17:40 crc kubenswrapper[4892]: I0217 19:17:40.900937 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.328015 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-64tqq"] Feb 17 19:17:41 crc kubenswrapper[4892]: W0217 19:17:41.452866 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26f9bb32_23ac_4897_a488_25db578b4696.slice/crio-5208cf61fcbd3648296ad45ab322ebfdcb884745ce4210e424d61c107490fab1 WatchSource:0}: Error finding container 5208cf61fcbd3648296ad45ab322ebfdcb884745ce4210e424d61c107490fab1: Status 404 returned error can't find the container with id 5208cf61fcbd3648296ad45ab322ebfdcb884745ce4210e424d61c107490fab1 Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.455558 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d452-account-create-update-ncvgx"] Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.773678 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="c2ffea8d-6680-40df-8b32-312e03efb9aa" containerID="64ede27a0870296551f3f27b1149629c1800c8d7d496d0949a20bc79ab66c89a" exitCode=0 Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.774558 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-64tqq" event={"ID":"c2ffea8d-6680-40df-8b32-312e03efb9aa","Type":"ContainerDied","Data":"64ede27a0870296551f3f27b1149629c1800c8d7d496d0949a20bc79ab66c89a"} Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.774653 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-64tqq" event={"ID":"c2ffea8d-6680-40df-8b32-312e03efb9aa","Type":"ContainerStarted","Data":"de4a5bfe55f1df324370ad349077d283f834670f5a46744d2b6fdd9b5ad1c92f"} Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.777802 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d452-account-create-update-ncvgx" event={"ID":"26f9bb32-23ac-4897-a488-25db578b4696","Type":"ContainerStarted","Data":"9b74d5e0808d4916acfebbdce4333b4195dc6c278e31c267ea060b4652a9b98f"} Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.777903 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d452-account-create-update-ncvgx" event={"ID":"26f9bb32-23ac-4897-a488-25db578b4696","Type":"ContainerStarted","Data":"5208cf61fcbd3648296ad45ab322ebfdcb884745ce4210e424d61c107490fab1"} Feb 17 19:17:41 crc kubenswrapper[4892]: I0217 19:17:41.818304 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d452-account-create-update-ncvgx" podStartSLOduration=1.818277758 podStartE2EDuration="1.818277758s" podCreationTimestamp="2026-02-17 19:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:41.807535108 +0000 UTC m=+5633.182938383" watchObservedRunningTime="2026-02-17 19:17:41.818277758 +0000 UTC m=+5633.193681033" 
Feb 17 19:17:42 crc kubenswrapper[4892]: I0217 19:17:42.793360 4892 generic.go:334] "Generic (PLEG): container finished" podID="26f9bb32-23ac-4897-a488-25db578b4696" containerID="9b74d5e0808d4916acfebbdce4333b4195dc6c278e31c267ea060b4652a9b98f" exitCode=0 Feb 17 19:17:42 crc kubenswrapper[4892]: I0217 19:17:42.796019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d452-account-create-update-ncvgx" event={"ID":"26f9bb32-23ac-4897-a488-25db578b4696","Type":"ContainerDied","Data":"9b74d5e0808d4916acfebbdce4333b4195dc6c278e31c267ea060b4652a9b98f"} Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.187159 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-64tqq" Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.340346 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q44kg\" (UniqueName: \"kubernetes.io/projected/c2ffea8d-6680-40df-8b32-312e03efb9aa-kube-api-access-q44kg\") pod \"c2ffea8d-6680-40df-8b32-312e03efb9aa\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.340675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ffea8d-6680-40df-8b32-312e03efb9aa-operator-scripts\") pod \"c2ffea8d-6680-40df-8b32-312e03efb9aa\" (UID: \"c2ffea8d-6680-40df-8b32-312e03efb9aa\") " Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.341468 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ffea8d-6680-40df-8b32-312e03efb9aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2ffea8d-6680-40df-8b32-312e03efb9aa" (UID: "c2ffea8d-6680-40df-8b32-312e03efb9aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.351998 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ffea8d-6680-40df-8b32-312e03efb9aa-kube-api-access-q44kg" (OuterVolumeSpecName: "kube-api-access-q44kg") pod "c2ffea8d-6680-40df-8b32-312e03efb9aa" (UID: "c2ffea8d-6680-40df-8b32-312e03efb9aa"). InnerVolumeSpecName "kube-api-access-q44kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.443276 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ffea8d-6680-40df-8b32-312e03efb9aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.443336 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q44kg\" (UniqueName: \"kubernetes.io/projected/c2ffea8d-6680-40df-8b32-312e03efb9aa-kube-api-access-q44kg\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.809158 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-64tqq" Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.809198 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-64tqq" event={"ID":"c2ffea8d-6680-40df-8b32-312e03efb9aa","Type":"ContainerDied","Data":"de4a5bfe55f1df324370ad349077d283f834670f5a46744d2b6fdd9b5ad1c92f"} Feb 17 19:17:43 crc kubenswrapper[4892]: I0217 19:17:43.809246 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4a5bfe55f1df324370ad349077d283f834670f5a46744d2b6fdd9b5ad1c92f" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.227087 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.366920 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26f9bb32-23ac-4897-a488-25db578b4696-operator-scripts\") pod \"26f9bb32-23ac-4897-a488-25db578b4696\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.367002 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gvhv\" (UniqueName: \"kubernetes.io/projected/26f9bb32-23ac-4897-a488-25db578b4696-kube-api-access-5gvhv\") pod \"26f9bb32-23ac-4897-a488-25db578b4696\" (UID: \"26f9bb32-23ac-4897-a488-25db578b4696\") " Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.367724 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f9bb32-23ac-4897-a488-25db578b4696-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26f9bb32-23ac-4897-a488-25db578b4696" (UID: "26f9bb32-23ac-4897-a488-25db578b4696"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.372240 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f9bb32-23ac-4897-a488-25db578b4696-kube-api-access-5gvhv" (OuterVolumeSpecName: "kube-api-access-5gvhv") pod "26f9bb32-23ac-4897-a488-25db578b4696" (UID: "26f9bb32-23ac-4897-a488-25db578b4696"). InnerVolumeSpecName "kube-api-access-5gvhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.470116 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26f9bb32-23ac-4897-a488-25db578b4696-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.470158 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gvhv\" (UniqueName: \"kubernetes.io/projected/26f9bb32-23ac-4897-a488-25db578b4696-kube-api-access-5gvhv\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.818953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d452-account-create-update-ncvgx" event={"ID":"26f9bb32-23ac-4897-a488-25db578b4696","Type":"ContainerDied","Data":"5208cf61fcbd3648296ad45ab322ebfdcb884745ce4210e424d61c107490fab1"} Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.818988 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5208cf61fcbd3648296ad45ab322ebfdcb884745ce4210e424d61c107490fab1" Feb 17 19:17:44 crc kubenswrapper[4892]: I0217 19:17:44.819039 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d452-account-create-update-ncvgx" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.925349 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-779fb5b75c-gs2vd"] Feb 17 19:17:45 crc kubenswrapper[4892]: E0217 19:17:45.926285 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f9bb32-23ac-4897-a488-25db578b4696" containerName="mariadb-account-create-update" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.926303 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f9bb32-23ac-4897-a488-25db578b4696" containerName="mariadb-account-create-update" Feb 17 19:17:45 crc kubenswrapper[4892]: E0217 19:17:45.926321 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ffea8d-6680-40df-8b32-312e03efb9aa" containerName="mariadb-database-create" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.926328 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ffea8d-6680-40df-8b32-312e03efb9aa" containerName="mariadb-database-create" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.926593 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ffea8d-6680-40df-8b32-312e03efb9aa" containerName="mariadb-database-create" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.926609 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f9bb32-23ac-4897-a488-25db578b4696" containerName="mariadb-account-create-update" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.927885 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.945492 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779fb5b75c-gs2vd"] Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.969886 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-j4r8j"] Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.981577 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.983286 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.983479 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 19:17:45 crc kubenswrapper[4892]: I0217 19:17:45.983842 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sg2ft" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.003756 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-j4r8j"] Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.100991 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-combined-ca-bundle\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101053 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-config-data\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " 
pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101082 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce8da88-116f-4f18-8d2f-050918735c8f-logs\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101105 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-scripts\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101133 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-config\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101231 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-sb\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101280 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-nb\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " 
pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101340 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cfx\" (UniqueName: \"kubernetes.io/projected/7ab3a525-f249-4171-b99a-e57c330dea23-kube-api-access-x9cfx\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101759 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7n7\" (UniqueName: \"kubernetes.io/projected/bce8da88-116f-4f18-8d2f-050918735c8f-kube-api-access-gk7n7\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.101806 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-dns-svc\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.203675 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cfx\" (UniqueName: \"kubernetes.io/projected/7ab3a525-f249-4171-b99a-e57c330dea23-kube-api-access-x9cfx\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7n7\" (UniqueName: \"kubernetes.io/projected/bce8da88-116f-4f18-8d2f-050918735c8f-kube-api-access-gk7n7\") pod \"placement-db-sync-j4r8j\" (UID: 
\"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204158 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-dns-svc\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204236 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-combined-ca-bundle\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204280 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-config-data\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204315 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce8da88-116f-4f18-8d2f-050918735c8f-logs\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204340 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-scripts\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204383 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-config\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-sb\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204455 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-nb\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.204895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce8da88-116f-4f18-8d2f-050918735c8f-logs\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.205413 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-dns-svc\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.205581 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-nb\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.205969 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-sb\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.206294 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-config\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.210303 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-config-data\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.211325 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-combined-ca-bundle\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.217093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-scripts\") pod \"placement-db-sync-j4r8j\" (UID: 
\"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.220532 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cfx\" (UniqueName: \"kubernetes.io/projected/7ab3a525-f249-4171-b99a-e57c330dea23-kube-api-access-x9cfx\") pod \"dnsmasq-dns-779fb5b75c-gs2vd\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.229669 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk7n7\" (UniqueName: \"kubernetes.io/projected/bce8da88-116f-4f18-8d2f-050918735c8f-kube-api-access-gk7n7\") pod \"placement-db-sync-j4r8j\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.260003 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.313199 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.847423 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779fb5b75c-gs2vd"] Feb 17 19:17:46 crc kubenswrapper[4892]: I0217 19:17:46.934977 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-j4r8j"] Feb 17 19:17:47 crc kubenswrapper[4892]: I0217 19:17:47.883008 4892 generic.go:334] "Generic (PLEG): container finished" podID="7ab3a525-f249-4171-b99a-e57c330dea23" containerID="79b3dcad0eafacf89d149280a9c5f685baac833ab204cf6d69254eeca63c0ff0" exitCode=0 Feb 17 19:17:47 crc kubenswrapper[4892]: I0217 19:17:47.883132 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" event={"ID":"7ab3a525-f249-4171-b99a-e57c330dea23","Type":"ContainerDied","Data":"79b3dcad0eafacf89d149280a9c5f685baac833ab204cf6d69254eeca63c0ff0"} Feb 17 19:17:47 crc kubenswrapper[4892]: I0217 19:17:47.883170 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" event={"ID":"7ab3a525-f249-4171-b99a-e57c330dea23","Type":"ContainerStarted","Data":"16d3e7fb5373b4f1193de8b3ee89f9089a6c9b01b3baf2e49cb28f81900fe577"} Feb 17 19:17:47 crc kubenswrapper[4892]: I0217 19:17:47.889692 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j4r8j" event={"ID":"bce8da88-116f-4f18-8d2f-050918735c8f","Type":"ContainerStarted","Data":"e40a8b2216b350aee3d40e0347f9e71faab93f401d48f3fa6717da13c86c484d"} Feb 17 19:17:47 crc kubenswrapper[4892]: I0217 19:17:47.889742 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j4r8j" event={"ID":"bce8da88-116f-4f18-8d2f-050918735c8f","Type":"ContainerStarted","Data":"67ddb8894d79531181bde7c9373953810767342692d3bfae5e6496b79d70d06e"} Feb 17 19:17:47 crc kubenswrapper[4892]: I0217 19:17:47.970464 4892 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/placement-db-sync-j4r8j" podStartSLOduration=2.9704424720000002 podStartE2EDuration="2.970442472s" podCreationTimestamp="2026-02-17 19:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:47.957166424 +0000 UTC m=+5639.332569689" watchObservedRunningTime="2026-02-17 19:17:47.970442472 +0000 UTC m=+5639.345845757" Feb 17 19:17:48 crc kubenswrapper[4892]: I0217 19:17:48.904059 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" event={"ID":"7ab3a525-f249-4171-b99a-e57c330dea23","Type":"ContainerStarted","Data":"7d0717f80637382ea7bd19ae66fb8bd6d9d1456f0d28546bf405ca9582346188"} Feb 17 19:17:48 crc kubenswrapper[4892]: I0217 19:17:48.904220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:48 crc kubenswrapper[4892]: I0217 19:17:48.905705 4892 generic.go:334] "Generic (PLEG): container finished" podID="bce8da88-116f-4f18-8d2f-050918735c8f" containerID="e40a8b2216b350aee3d40e0347f9e71faab93f401d48f3fa6717da13c86c484d" exitCode=0 Feb 17 19:17:48 crc kubenswrapper[4892]: I0217 19:17:48.905872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j4r8j" event={"ID":"bce8da88-116f-4f18-8d2f-050918735c8f","Type":"ContainerDied","Data":"e40a8b2216b350aee3d40e0347f9e71faab93f401d48f3fa6717da13c86c484d"} Feb 17 19:17:48 crc kubenswrapper[4892]: I0217 19:17:48.944069 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" podStartSLOduration=3.944051375 podStartE2EDuration="3.944051375s" podCreationTimestamp="2026-02-17 19:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:48.930485628 +0000 UTC 
m=+5640.305888893" watchObservedRunningTime="2026-02-17 19:17:48.944051375 +0000 UTC m=+5640.319454640" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.318181 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.510500 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-scripts\") pod \"bce8da88-116f-4f18-8d2f-050918735c8f\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.510937 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce8da88-116f-4f18-8d2f-050918735c8f-logs\") pod \"bce8da88-116f-4f18-8d2f-050918735c8f\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.511044 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-config-data\") pod \"bce8da88-116f-4f18-8d2f-050918735c8f\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.511149 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7n7\" (UniqueName: \"kubernetes.io/projected/bce8da88-116f-4f18-8d2f-050918735c8f-kube-api-access-gk7n7\") pod \"bce8da88-116f-4f18-8d2f-050918735c8f\" (UID: \"bce8da88-116f-4f18-8d2f-050918735c8f\") " Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.511257 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-combined-ca-bundle\") pod \"bce8da88-116f-4f18-8d2f-050918735c8f\" (UID: 
\"bce8da88-116f-4f18-8d2f-050918735c8f\") " Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.511435 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce8da88-116f-4f18-8d2f-050918735c8f-logs" (OuterVolumeSpecName: "logs") pod "bce8da88-116f-4f18-8d2f-050918735c8f" (UID: "bce8da88-116f-4f18-8d2f-050918735c8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.511746 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce8da88-116f-4f18-8d2f-050918735c8f-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.515894 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-scripts" (OuterVolumeSpecName: "scripts") pod "bce8da88-116f-4f18-8d2f-050918735c8f" (UID: "bce8da88-116f-4f18-8d2f-050918735c8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.516607 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce8da88-116f-4f18-8d2f-050918735c8f-kube-api-access-gk7n7" (OuterVolumeSpecName: "kube-api-access-gk7n7") pod "bce8da88-116f-4f18-8d2f-050918735c8f" (UID: "bce8da88-116f-4f18-8d2f-050918735c8f"). InnerVolumeSpecName "kube-api-access-gk7n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.557001 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-config-data" (OuterVolumeSpecName: "config-data") pod "bce8da88-116f-4f18-8d2f-050918735c8f" (UID: "bce8da88-116f-4f18-8d2f-050918735c8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.569459 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bce8da88-116f-4f18-8d2f-050918735c8f" (UID: "bce8da88-116f-4f18-8d2f-050918735c8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.615237 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.615294 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7n7\" (UniqueName: \"kubernetes.io/projected/bce8da88-116f-4f18-8d2f-050918735c8f-kube-api-access-gk7n7\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.615316 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.615336 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce8da88-116f-4f18-8d2f-050918735c8f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.970098 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j4r8j" event={"ID":"bce8da88-116f-4f18-8d2f-050918735c8f","Type":"ContainerDied","Data":"67ddb8894d79531181bde7c9373953810767342692d3bfae5e6496b79d70d06e"} Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.970143 4892 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="67ddb8894d79531181bde7c9373953810767342692d3bfae5e6496b79d70d06e" Feb 17 19:17:50 crc kubenswrapper[4892]: I0217 19:17:50.970219 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-j4r8j" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.529801 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-89979d8dd-729tc"] Feb 17 19:17:51 crc kubenswrapper[4892]: E0217 19:17:51.530519 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce8da88-116f-4f18-8d2f-050918735c8f" containerName="placement-db-sync" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.530533 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce8da88-116f-4f18-8d2f-050918735c8f" containerName="placement-db-sync" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.530721 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce8da88-116f-4f18-8d2f-050918735c8f" containerName="placement-db-sync" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.531733 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.533878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.537156 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-combined-ca-bundle\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.537275 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-config-data\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.537311 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155c783-830f-44c1-85da-037c34052461-logs\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.537358 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-scripts\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.537376 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6dfb\" 
(UniqueName: \"kubernetes.io/projected/e155c783-830f-44c1-85da-037c34052461-kube-api-access-k6dfb\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.540940 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.541007 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-89979d8dd-729tc"] Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.541536 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sg2ft" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.639289 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-config-data\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.639352 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155c783-830f-44c1-85da-037c34052461-logs\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.639394 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-scripts\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.639414 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k6dfb\" (UniqueName: \"kubernetes.io/projected/e155c783-830f-44c1-85da-037c34052461-kube-api-access-k6dfb\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.639452 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-combined-ca-bundle\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.640293 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155c783-830f-44c1-85da-037c34052461-logs\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.644205 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-combined-ca-bundle\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.644589 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-config-data\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.651089 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e155c783-830f-44c1-85da-037c34052461-scripts\") pod 
\"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.674304 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6dfb\" (UniqueName: \"kubernetes.io/projected/e155c783-830f-44c1-85da-037c34052461-kube-api-access-k6dfb\") pod \"placement-89979d8dd-729tc\" (UID: \"e155c783-830f-44c1-85da-037c34052461\") " pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:51 crc kubenswrapper[4892]: I0217 19:17:51.849619 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:52 crc kubenswrapper[4892]: W0217 19:17:52.338275 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode155c783_830f_44c1_85da_037c34052461.slice/crio-5156550e61ddb1059ba54e7d4677d28d491f32a616714f0a08f4a297afd6d95c WatchSource:0}: Error finding container 5156550e61ddb1059ba54e7d4677d28d491f32a616714f0a08f4a297afd6d95c: Status 404 returned error can't find the container with id 5156550e61ddb1059ba54e7d4677d28d491f32a616714f0a08f4a297afd6d95c Feb 17 19:17:52 crc kubenswrapper[4892]: I0217 19:17:52.345077 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-89979d8dd-729tc"] Feb 17 19:17:52 crc kubenswrapper[4892]: I0217 19:17:52.991843 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89979d8dd-729tc" event={"ID":"e155c783-830f-44c1-85da-037c34052461","Type":"ContainerStarted","Data":"91c4beb2f6e68438b0f919f3121be5fb2d77f8f91c97bfa05bd4d0ec79b98714"} Feb 17 19:17:52 crc kubenswrapper[4892]: I0217 19:17:52.992276 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89979d8dd-729tc" 
event={"ID":"e155c783-830f-44c1-85da-037c34052461","Type":"ContainerStarted","Data":"b9c9ab4f5bf062a5c95cdce5a71f22e77ac7a2cfcc66d75e9732bc16add45666"} Feb 17 19:17:52 crc kubenswrapper[4892]: I0217 19:17:52.992292 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89979d8dd-729tc" event={"ID":"e155c783-830f-44c1-85da-037c34052461","Type":"ContainerStarted","Data":"5156550e61ddb1059ba54e7d4677d28d491f32a616714f0a08f4a297afd6d95c"} Feb 17 19:17:52 crc kubenswrapper[4892]: I0217 19:17:52.992625 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:52 crc kubenswrapper[4892]: I0217 19:17:52.992648 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-89979d8dd-729tc" Feb 17 19:17:53 crc kubenswrapper[4892]: I0217 19:17:53.012199 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-89979d8dd-729tc" podStartSLOduration=2.012179472 podStartE2EDuration="2.012179472s" podCreationTimestamp="2026-02-17 19:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:17:53.007848865 +0000 UTC m=+5644.383252150" watchObservedRunningTime="2026-02-17 19:17:53.012179472 +0000 UTC m=+5644.387582737" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.261024 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.340192 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cc68945df-zppb2"] Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.340987 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" containerName="dnsmasq-dns" 
containerID="cri-o://58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170" gracePeriod=10 Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.822054 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.846299 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-nb\") pod \"a8b44870-db5a-4228-bc5f-6e85be75fe36\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.846399 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-dns-svc\") pod \"a8b44870-db5a-4228-bc5f-6e85be75fe36\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.846474 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-config\") pod \"a8b44870-db5a-4228-bc5f-6e85be75fe36\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.846568 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-sb\") pod \"a8b44870-db5a-4228-bc5f-6e85be75fe36\" (UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.846661 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kcnc\" (UniqueName: \"kubernetes.io/projected/a8b44870-db5a-4228-bc5f-6e85be75fe36-kube-api-access-2kcnc\") pod \"a8b44870-db5a-4228-bc5f-6e85be75fe36\" 
(UID: \"a8b44870-db5a-4228-bc5f-6e85be75fe36\") " Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.865045 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b44870-db5a-4228-bc5f-6e85be75fe36-kube-api-access-2kcnc" (OuterVolumeSpecName: "kube-api-access-2kcnc") pod "a8b44870-db5a-4228-bc5f-6e85be75fe36" (UID: "a8b44870-db5a-4228-bc5f-6e85be75fe36"). InnerVolumeSpecName "kube-api-access-2kcnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.897017 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8b44870-db5a-4228-bc5f-6e85be75fe36" (UID: "a8b44870-db5a-4228-bc5f-6e85be75fe36"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.897372 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-config" (OuterVolumeSpecName: "config") pod "a8b44870-db5a-4228-bc5f-6e85be75fe36" (UID: "a8b44870-db5a-4228-bc5f-6e85be75fe36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.916470 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8b44870-db5a-4228-bc5f-6e85be75fe36" (UID: "a8b44870-db5a-4228-bc5f-6e85be75fe36"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.921596 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8b44870-db5a-4228-bc5f-6e85be75fe36" (UID: "a8b44870-db5a-4228-bc5f-6e85be75fe36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.948652 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.948691 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.948700 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.948714 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kcnc\" (UniqueName: \"kubernetes.io/projected/a8b44870-db5a-4228-bc5f-6e85be75fe36-kube-api-access-2kcnc\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:56 crc kubenswrapper[4892]: I0217 19:17:56.948726 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8b44870-db5a-4228-bc5f-6e85be75fe36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.044033 4892 generic.go:334] "Generic (PLEG): container finished" podID="a8b44870-db5a-4228-bc5f-6e85be75fe36" 
containerID="58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170" exitCode=0 Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.044101 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" event={"ID":"a8b44870-db5a-4228-bc5f-6e85be75fe36","Type":"ContainerDied","Data":"58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170"} Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.044139 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" event={"ID":"a8b44870-db5a-4228-bc5f-6e85be75fe36","Type":"ContainerDied","Data":"dbe16d04c27173bacf838b2865831c8f30adba155715e7e122d5e7f4074e7984"} Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.044167 4892 scope.go:117] "RemoveContainer" containerID="58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.044422 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cc68945df-zppb2" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.077515 4892 scope.go:117] "RemoveContainer" containerID="9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.111708 4892 scope.go:117] "RemoveContainer" containerID="58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170" Feb 17 19:17:57 crc kubenswrapper[4892]: E0217 19:17:57.112220 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170\": container with ID starting with 58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170 not found: ID does not exist" containerID="58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.112268 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170"} err="failed to get container status \"58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170\": rpc error: code = NotFound desc = could not find container \"58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170\": container with ID starting with 58c1befcb6b6d10cc2ef84862f9588cfaea8ee51e4dddd42c1df0e4fe86bd170 not found: ID does not exist" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.112303 4892 scope.go:117] "RemoveContainer" containerID="9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152" Feb 17 19:17:57 crc kubenswrapper[4892]: E0217 19:17:57.112630 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152\": container with ID starting with 
9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152 not found: ID does not exist" containerID="9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.112662 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152"} err="failed to get container status \"9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152\": rpc error: code = NotFound desc = could not find container \"9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152\": container with ID starting with 9a05b6950990a15db97edb4d8cb408031574a6159f69ce0921f4e2a2a3e9a152 not found: ID does not exist" Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.114447 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cc68945df-zppb2"] Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.126108 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cc68945df-zppb2"] Feb 17 19:17:57 crc kubenswrapper[4892]: I0217 19:17:57.380780 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" path="/var/lib/kubelet/pods/a8b44870-db5a-4228-bc5f-6e85be75fe36/volumes" Feb 17 19:18:01 crc kubenswrapper[4892]: I0217 19:18:01.887058 4892 scope.go:117] "RemoveContainer" containerID="f6a474d4d576b696edc8fcb48689448187c502500e101eb99a85ddc9a364f483" Feb 17 19:18:22 crc kubenswrapper[4892]: I0217 19:18:22.889013 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-89979d8dd-729tc" Feb 17 19:18:22 crc kubenswrapper[4892]: I0217 19:18:22.893542 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-89979d8dd-729tc" Feb 17 19:18:37 crc kubenswrapper[4892]: I0217 19:18:37.424701 4892 patch_prober.go:28] interesting 
pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:18:37 crc kubenswrapper[4892]: I0217 19:18:37.425507 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:18:43 crc kubenswrapper[4892]: E0217 19:18:43.780067 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.41:51912->38.102.83.41:46529: write tcp 38.102.83.41:51912->38.102.83.41:46529: write: broken pipe Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.899379 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6t524"] Feb 17 19:18:46 crc kubenswrapper[4892]: E0217 19:18:46.900257 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" containerName="init" Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.900269 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" containerName="init" Feb 17 19:18:46 crc kubenswrapper[4892]: E0217 19:18:46.900283 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" containerName="dnsmasq-dns" Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.900289 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" containerName="dnsmasq-dns" Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.900488 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b44870-db5a-4228-bc5f-6e85be75fe36" 
containerName="dnsmasq-dns" Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.901139 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.913389 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6t524"] Feb 17 19:18:46 crc kubenswrapper[4892]: I0217 19:18:46.993362 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7phmt"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.012536 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40507831-2854-429d-b79a-2da3e53325ba-operator-scripts\") pod \"nova-api-db-create-6t524\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.012950 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx7bp\" (UniqueName: \"kubernetes.io/projected/40507831-2854-429d-b79a-2da3e53325ba-kube-api-access-mx7bp\") pod \"nova-api-db-create-6t524\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.015082 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7phmt"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.015206 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.053865 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9160-account-create-update-2ph45"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.057390 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.065419 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.066409 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9160-account-create-update-2ph45"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.125140 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hcj\" (UniqueName: \"kubernetes.io/projected/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-kube-api-access-p4hcj\") pod \"nova-cell0-db-create-7phmt\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.125758 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40507831-2854-429d-b79a-2da3e53325ba-operator-scripts\") pod \"nova-api-db-create-6t524\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.126508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-operator-scripts\") pod \"nova-cell0-db-create-7phmt\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.126615 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snsb\" (UniqueName: \"kubernetes.io/projected/751ad68c-37cf-403a-8a09-e0ff6a096874-kube-api-access-7snsb\") pod \"nova-api-9160-account-create-update-2ph45\" (UID: 
\"751ad68c-37cf-403a-8a09-e0ff6a096874\") " pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.126729 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ad68c-37cf-403a-8a09-e0ff6a096874-operator-scripts\") pod \"nova-api-9160-account-create-update-2ph45\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.126845 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx7bp\" (UniqueName: \"kubernetes.io/projected/40507831-2854-429d-b79a-2da3e53325ba-kube-api-access-mx7bp\") pod \"nova-api-db-create-6t524\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.125701 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ltfnh"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.126449 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40507831-2854-429d-b79a-2da3e53325ba-operator-scripts\") pod \"nova-api-db-create-6t524\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.129415 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.134921 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ltfnh"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.152877 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx7bp\" (UniqueName: \"kubernetes.io/projected/40507831-2854-429d-b79a-2da3e53325ba-kube-api-access-mx7bp\") pod \"nova-api-db-create-6t524\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.198796 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-01db-account-create-update-5crxt"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.200312 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.204778 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.227113 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-01db-account-create-update-5crxt"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.227985 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbdv\" (UniqueName: \"kubernetes.io/projected/953c04db-941c-4906-92b2-69578f20f3ce-kube-api-access-dhbdv\") pod \"nova-cell1-db-create-ltfnh\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.228139 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hcj\" (UniqueName: 
\"kubernetes.io/projected/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-kube-api-access-p4hcj\") pod \"nova-cell0-db-create-7phmt\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.228513 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-operator-scripts\") pod \"nova-cell0-db-create-7phmt\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.228600 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953c04db-941c-4906-92b2-69578f20f3ce-operator-scripts\") pod \"nova-cell1-db-create-ltfnh\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.228690 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snsb\" (UniqueName: \"kubernetes.io/projected/751ad68c-37cf-403a-8a09-e0ff6a096874-kube-api-access-7snsb\") pod \"nova-api-9160-account-create-update-2ph45\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.228787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ad68c-37cf-403a-8a09-e0ff6a096874-operator-scripts\") pod \"nova-api-9160-account-create-update-2ph45\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.228935 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvxb\" (UniqueName: \"kubernetes.io/projected/e9b34438-358d-4397-8d7a-0489bddea606-kube-api-access-9fvxb\") pod \"nova-cell0-01db-account-create-update-5crxt\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.229013 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b34438-358d-4397-8d7a-0489bddea606-operator-scripts\") pod \"nova-cell0-01db-account-create-update-5crxt\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.229748 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ad68c-37cf-403a-8a09-e0ff6a096874-operator-scripts\") pod \"nova-api-9160-account-create-update-2ph45\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.230229 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-operator-scripts\") pod \"nova-cell0-db-create-7phmt\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.232216 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.245300 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hcj\" (UniqueName: \"kubernetes.io/projected/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-kube-api-access-p4hcj\") pod \"nova-cell0-db-create-7phmt\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.245770 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snsb\" (UniqueName: \"kubernetes.io/projected/751ad68c-37cf-403a-8a09-e0ff6a096874-kube-api-access-7snsb\") pod \"nova-api-9160-account-create-update-2ph45\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.336178 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvxb\" (UniqueName: \"kubernetes.io/projected/e9b34438-358d-4397-8d7a-0489bddea606-kube-api-access-9fvxb\") pod \"nova-cell0-01db-account-create-update-5crxt\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.336462 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b34438-358d-4397-8d7a-0489bddea606-operator-scripts\") pod \"nova-cell0-01db-account-create-update-5crxt\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.336571 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbdv\" (UniqueName: 
\"kubernetes.io/projected/953c04db-941c-4906-92b2-69578f20f3ce-kube-api-access-dhbdv\") pod \"nova-cell1-db-create-ltfnh\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.336679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953c04db-941c-4906-92b2-69578f20f3ce-operator-scripts\") pod \"nova-cell1-db-create-ltfnh\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.337219 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b34438-358d-4397-8d7a-0489bddea606-operator-scripts\") pod \"nova-cell0-01db-account-create-update-5crxt\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.337618 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953c04db-941c-4906-92b2-69578f20f3ce-operator-scripts\") pod \"nova-cell1-db-create-ltfnh\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.356135 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbdv\" (UniqueName: \"kubernetes.io/projected/953c04db-941c-4906-92b2-69578f20f3ce-kube-api-access-dhbdv\") pod \"nova-cell1-db-create-ltfnh\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.356579 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.373313 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvxb\" (UniqueName: \"kubernetes.io/projected/e9b34438-358d-4397-8d7a-0489bddea606-kube-api-access-9fvxb\") pod \"nova-cell0-01db-account-create-update-5crxt\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.400412 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.410804 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d240-account-create-update-sxl8m"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.412570 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.415411 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.427952 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d240-account-create-update-sxl8m"] Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.449604 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.525833 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.540163 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865zm\" (UniqueName: \"kubernetes.io/projected/e9502d16-c7a7-47ab-8103-f05229ca14ae-kube-api-access-865zm\") pod \"nova-cell1-d240-account-create-update-sxl8m\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.540266 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9502d16-c7a7-47ab-8103-f05229ca14ae-operator-scripts\") pod \"nova-cell1-d240-account-create-update-sxl8m\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.641697 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-865zm\" (UniqueName: \"kubernetes.io/projected/e9502d16-c7a7-47ab-8103-f05229ca14ae-kube-api-access-865zm\") pod \"nova-cell1-d240-account-create-update-sxl8m\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.641784 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9502d16-c7a7-47ab-8103-f05229ca14ae-operator-scripts\") pod \"nova-cell1-d240-account-create-update-sxl8m\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.646968 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e9502d16-c7a7-47ab-8103-f05229ca14ae-operator-scripts\") pod \"nova-cell1-d240-account-create-update-sxl8m\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.657707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-865zm\" (UniqueName: \"kubernetes.io/projected/e9502d16-c7a7-47ab-8103-f05229ca14ae-kube-api-access-865zm\") pod \"nova-cell1-d240-account-create-update-sxl8m\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.738140 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:47 crc kubenswrapper[4892]: I0217 19:18:47.760951 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6t524"] Feb 17 19:18:47 crc kubenswrapper[4892]: W0217 19:18:47.779291 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40507831_2854_429d_b79a_2da3e53325ba.slice/crio-6a6431b52fbafd07175e602bc994eb7f42cde6e1aebeecddaa419011b3f24a0e WatchSource:0}: Error finding container 6a6431b52fbafd07175e602bc994eb7f42cde6e1aebeecddaa419011b3f24a0e: Status 404 returned error can't find the container with id 6a6431b52fbafd07175e602bc994eb7f42cde6e1aebeecddaa419011b3f24a0e Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.044937 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7phmt"] Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.063249 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9160-account-create-update-2ph45"] Feb 17 19:18:48 crc kubenswrapper[4892]: W0217 19:18:48.063675 4892 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod953c04db_941c_4906_92b2_69578f20f3ce.slice/crio-82db282c2d3f0c2436e94d376707ce315377100bd446653e818cea23148a812d WatchSource:0}: Error finding container 82db282c2d3f0c2436e94d376707ce315377100bd446653e818cea23148a812d: Status 404 returned error can't find the container with id 82db282c2d3f0c2436e94d376707ce315377100bd446653e818cea23148a812d Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.071926 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ltfnh"] Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.190372 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-01db-account-create-update-5crxt"] Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.406370 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d240-account-create-update-sxl8m"] Feb 17 19:18:48 crc kubenswrapper[4892]: W0217 19:18:48.429081 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9502d16_c7a7_47ab_8103_f05229ca14ae.slice/crio-73719fe4647adde3c270fff7e8681cd3c12c51b983005611aeea4c618bd0b7ca WatchSource:0}: Error finding container 73719fe4647adde3c270fff7e8681cd3c12c51b983005611aeea4c618bd0b7ca: Status 404 returned error can't find the container with id 73719fe4647adde3c270fff7e8681cd3c12c51b983005611aeea4c618bd0b7ca Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.674363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7phmt" event={"ID":"0a22ec1e-d67b-40f0-9adc-0cc81be398a5","Type":"ContainerStarted","Data":"13f6f892a55a2f43a917e4a35d18bc360b8ba41d67d20fbd612812517991abc9"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.674405 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-7phmt" event={"ID":"0a22ec1e-d67b-40f0-9adc-0cc81be398a5","Type":"ContainerStarted","Data":"1f9b229043fa7a108cdbda573c589681d246dbbf54765799fd038ce7a6da5d3e"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.685953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-01db-account-create-update-5crxt" event={"ID":"e9b34438-358d-4397-8d7a-0489bddea606","Type":"ContainerStarted","Data":"668d4f2f067d0eb8d9008384e88dc8609ef1a33aa8c63f63eb80946319a2abe7"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.686003 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-01db-account-create-update-5crxt" event={"ID":"e9b34438-358d-4397-8d7a-0489bddea606","Type":"ContainerStarted","Data":"cc6329284e873af8917cd067d23e0c2c21f5b5b6b7d713579e013e46ac7c59ee"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.689466 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" event={"ID":"e9502d16-c7a7-47ab-8103-f05229ca14ae","Type":"ContainerStarted","Data":"73719fe4647adde3c270fff7e8681cd3c12c51b983005611aeea4c618bd0b7ca"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.692499 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9160-account-create-update-2ph45" event={"ID":"751ad68c-37cf-403a-8a09-e0ff6a096874","Type":"ContainerStarted","Data":"8cf66092601ece5e6e73f766aad3a5cde9245c52106e26fc7e4ed6fd6c31db29"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.692550 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9160-account-create-update-2ph45" event={"ID":"751ad68c-37cf-403a-8a09-e0ff6a096874","Type":"ContainerStarted","Data":"4aacdbd13a8297c71d5bfcad40be0840e2d274bc956ff6d9fe8056fb509a8d00"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.705422 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-db-create-7phmt" podStartSLOduration=2.705395691 podStartE2EDuration="2.705395691s" podCreationTimestamp="2026-02-17 19:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:18:48.695476314 +0000 UTC m=+5700.070879579" watchObservedRunningTime="2026-02-17 19:18:48.705395691 +0000 UTC m=+5700.080798956" Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.706501 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ltfnh" event={"ID":"953c04db-941c-4906-92b2-69578f20f3ce","Type":"ContainerStarted","Data":"a250680d484833a0b789f6bc0d411c9d61592e521cfc90b94a5ebf0331386fc0"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.706544 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ltfnh" event={"ID":"953c04db-941c-4906-92b2-69578f20f3ce","Type":"ContainerStarted","Data":"82db282c2d3f0c2436e94d376707ce315377100bd446653e818cea23148a812d"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.711410 4892 generic.go:334] "Generic (PLEG): container finished" podID="40507831-2854-429d-b79a-2da3e53325ba" containerID="d6cf0779e1b78cc9bdba4d1263425f4c6c8e7d26a4037734497d6f154374b45c" exitCode=0 Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.711460 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6t524" event={"ID":"40507831-2854-429d-b79a-2da3e53325ba","Type":"ContainerDied","Data":"d6cf0779e1b78cc9bdba4d1263425f4c6c8e7d26a4037734497d6f154374b45c"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.711486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6t524" event={"ID":"40507831-2854-429d-b79a-2da3e53325ba","Type":"ContainerStarted","Data":"6a6431b52fbafd07175e602bc994eb7f42cde6e1aebeecddaa419011b3f24a0e"} Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.723083 
4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" podStartSLOduration=1.723062247 podStartE2EDuration="1.723062247s" podCreationTimestamp="2026-02-17 19:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:18:48.713069788 +0000 UTC m=+5700.088473053" watchObservedRunningTime="2026-02-17 19:18:48.723062247 +0000 UTC m=+5700.098465512" Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.731481 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-01db-account-create-update-5crxt" podStartSLOduration=1.731463794 podStartE2EDuration="1.731463794s" podCreationTimestamp="2026-02-17 19:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:18:48.729292166 +0000 UTC m=+5700.104695441" watchObservedRunningTime="2026-02-17 19:18:48.731463794 +0000 UTC m=+5700.106867049" Feb 17 19:18:48 crc kubenswrapper[4892]: I0217 19:18:48.757886 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-9160-account-create-update-2ph45" podStartSLOduration=2.757863727 podStartE2EDuration="2.757863727s" podCreationTimestamp="2026-02-17 19:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:18:48.748680379 +0000 UTC m=+5700.124083644" watchObservedRunningTime="2026-02-17 19:18:48.757863727 +0000 UTC m=+5700.133266992" Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.726939 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a22ec1e-d67b-40f0-9adc-0cc81be398a5" containerID="13f6f892a55a2f43a917e4a35d18bc360b8ba41d67d20fbd612812517991abc9" exitCode=0 Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 
19:18:49.727070 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7phmt" event={"ID":"0a22ec1e-d67b-40f0-9adc-0cc81be398a5","Type":"ContainerDied","Data":"13f6f892a55a2f43a917e4a35d18bc360b8ba41d67d20fbd612812517991abc9"} Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.729347 4892 generic.go:334] "Generic (PLEG): container finished" podID="e9b34438-358d-4397-8d7a-0489bddea606" containerID="668d4f2f067d0eb8d9008384e88dc8609ef1a33aa8c63f63eb80946319a2abe7" exitCode=0 Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.729448 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-01db-account-create-update-5crxt" event={"ID":"e9b34438-358d-4397-8d7a-0489bddea606","Type":"ContainerDied","Data":"668d4f2f067d0eb8d9008384e88dc8609ef1a33aa8c63f63eb80946319a2abe7"} Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.735087 4892 generic.go:334] "Generic (PLEG): container finished" podID="e9502d16-c7a7-47ab-8103-f05229ca14ae" containerID="08b9f92b44090339833a5a7f102d02ee2e686638c10087a7ae249439ca64d4fd" exitCode=0 Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.735144 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" event={"ID":"e9502d16-c7a7-47ab-8103-f05229ca14ae","Type":"ContainerDied","Data":"08b9f92b44090339833a5a7f102d02ee2e686638c10087a7ae249439ca64d4fd"} Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.739217 4892 generic.go:334] "Generic (PLEG): container finished" podID="751ad68c-37cf-403a-8a09-e0ff6a096874" containerID="8cf66092601ece5e6e73f766aad3a5cde9245c52106e26fc7e4ed6fd6c31db29" exitCode=0 Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.739355 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9160-account-create-update-2ph45" 
event={"ID":"751ad68c-37cf-403a-8a09-e0ff6a096874","Type":"ContainerDied","Data":"8cf66092601ece5e6e73f766aad3a5cde9245c52106e26fc7e4ed6fd6c31db29"} Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.742378 4892 generic.go:334] "Generic (PLEG): container finished" podID="953c04db-941c-4906-92b2-69578f20f3ce" containerID="a250680d484833a0b789f6bc0d411c9d61592e521cfc90b94a5ebf0331386fc0" exitCode=0 Feb 17 19:18:49 crc kubenswrapper[4892]: I0217 19:18:49.742414 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ltfnh" event={"ID":"953c04db-941c-4906-92b2-69578f20f3ce","Type":"ContainerDied","Data":"a250680d484833a0b789f6bc0d411c9d61592e521cfc90b94a5ebf0331386fc0"} Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.138622 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.144358 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.197758 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhbdv\" (UniqueName: \"kubernetes.io/projected/953c04db-941c-4906-92b2-69578f20f3ce-kube-api-access-dhbdv\") pod \"953c04db-941c-4906-92b2-69578f20f3ce\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.198021 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx7bp\" (UniqueName: \"kubernetes.io/projected/40507831-2854-429d-b79a-2da3e53325ba-kube-api-access-mx7bp\") pod \"40507831-2854-429d-b79a-2da3e53325ba\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.198131 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40507831-2854-429d-b79a-2da3e53325ba-operator-scripts\") pod \"40507831-2854-429d-b79a-2da3e53325ba\" (UID: \"40507831-2854-429d-b79a-2da3e53325ba\") " Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.198228 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953c04db-941c-4906-92b2-69578f20f3ce-operator-scripts\") pod \"953c04db-941c-4906-92b2-69578f20f3ce\" (UID: \"953c04db-941c-4906-92b2-69578f20f3ce\") " Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.198893 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40507831-2854-429d-b79a-2da3e53325ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40507831-2854-429d-b79a-2da3e53325ba" (UID: "40507831-2854-429d-b79a-2da3e53325ba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.199175 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/953c04db-941c-4906-92b2-69578f20f3ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "953c04db-941c-4906-92b2-69578f20f3ce" (UID: "953c04db-941c-4906-92b2-69578f20f3ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.203783 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953c04db-941c-4906-92b2-69578f20f3ce-kube-api-access-dhbdv" (OuterVolumeSpecName: "kube-api-access-dhbdv") pod "953c04db-941c-4906-92b2-69578f20f3ce" (UID: "953c04db-941c-4906-92b2-69578f20f3ce"). InnerVolumeSpecName "kube-api-access-dhbdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.204297 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40507831-2854-429d-b79a-2da3e53325ba-kube-api-access-mx7bp" (OuterVolumeSpecName: "kube-api-access-mx7bp") pod "40507831-2854-429d-b79a-2da3e53325ba" (UID: "40507831-2854-429d-b79a-2da3e53325ba"). InnerVolumeSpecName "kube-api-access-mx7bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.300404 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhbdv\" (UniqueName: \"kubernetes.io/projected/953c04db-941c-4906-92b2-69578f20f3ce-kube-api-access-dhbdv\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.300472 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx7bp\" (UniqueName: \"kubernetes.io/projected/40507831-2854-429d-b79a-2da3e53325ba-kube-api-access-mx7bp\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.300503 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40507831-2854-429d-b79a-2da3e53325ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.300532 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953c04db-941c-4906-92b2-69578f20f3ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.774475 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ltfnh" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.774530 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ltfnh" event={"ID":"953c04db-941c-4906-92b2-69578f20f3ce","Type":"ContainerDied","Data":"82db282c2d3f0c2436e94d376707ce315377100bd446653e818cea23148a812d"} Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.774596 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82db282c2d3f0c2436e94d376707ce315377100bd446653e818cea23148a812d" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.777684 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6t524" Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.778200 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6t524" event={"ID":"40507831-2854-429d-b79a-2da3e53325ba","Type":"ContainerDied","Data":"6a6431b52fbafd07175e602bc994eb7f42cde6e1aebeecddaa419011b3f24a0e"} Feb 17 19:18:50 crc kubenswrapper[4892]: I0217 19:18:50.778665 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6431b52fbafd07175e602bc994eb7f42cde6e1aebeecddaa419011b3f24a0e" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.147417 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.245643 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snsb\" (UniqueName: \"kubernetes.io/projected/751ad68c-37cf-403a-8a09-e0ff6a096874-kube-api-access-7snsb\") pod \"751ad68c-37cf-403a-8a09-e0ff6a096874\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.245849 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ad68c-37cf-403a-8a09-e0ff6a096874-operator-scripts\") pod \"751ad68c-37cf-403a-8a09-e0ff6a096874\" (UID: \"751ad68c-37cf-403a-8a09-e0ff6a096874\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.246674 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/751ad68c-37cf-403a-8a09-e0ff6a096874-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "751ad68c-37cf-403a-8a09-e0ff6a096874" (UID: "751ad68c-37cf-403a-8a09-e0ff6a096874"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.251018 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751ad68c-37cf-403a-8a09-e0ff6a096874-kube-api-access-7snsb" (OuterVolumeSpecName: "kube-api-access-7snsb") pod "751ad68c-37cf-403a-8a09-e0ff6a096874" (UID: "751ad68c-37cf-403a-8a09-e0ff6a096874"). InnerVolumeSpecName "kube-api-access-7snsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.328336 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.334717 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.345546 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.347374 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b34438-358d-4397-8d7a-0489bddea606-operator-scripts\") pod \"e9b34438-358d-4397-8d7a-0489bddea606\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.347476 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvxb\" (UniqueName: \"kubernetes.io/projected/e9b34438-358d-4397-8d7a-0489bddea606-kube-api-access-9fvxb\") pod \"e9b34438-358d-4397-8d7a-0489bddea606\" (UID: \"e9b34438-358d-4397-8d7a-0489bddea606\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.347806 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9b34438-358d-4397-8d7a-0489bddea606-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9b34438-358d-4397-8d7a-0489bddea606" (UID: "e9b34438-358d-4397-8d7a-0489bddea606"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.348486 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ad68c-37cf-403a-8a09-e0ff6a096874-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.348508 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b34438-358d-4397-8d7a-0489bddea606-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.348517 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snsb\" (UniqueName: \"kubernetes.io/projected/751ad68c-37cf-403a-8a09-e0ff6a096874-kube-api-access-7snsb\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.350032 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b34438-358d-4397-8d7a-0489bddea606-kube-api-access-9fvxb" (OuterVolumeSpecName: "kube-api-access-9fvxb") pod "e9b34438-358d-4397-8d7a-0489bddea606" (UID: "e9b34438-358d-4397-8d7a-0489bddea606"). InnerVolumeSpecName "kube-api-access-9fvxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.449277 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9502d16-c7a7-47ab-8103-f05229ca14ae-operator-scripts\") pod \"e9502d16-c7a7-47ab-8103-f05229ca14ae\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.449418 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-operator-scripts\") pod \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.449459 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-865zm\" (UniqueName: \"kubernetes.io/projected/e9502d16-c7a7-47ab-8103-f05229ca14ae-kube-api-access-865zm\") pod \"e9502d16-c7a7-47ab-8103-f05229ca14ae\" (UID: \"e9502d16-c7a7-47ab-8103-f05229ca14ae\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.449562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4hcj\" (UniqueName: \"kubernetes.io/projected/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-kube-api-access-p4hcj\") pod \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\" (UID: \"0a22ec1e-d67b-40f0-9adc-0cc81be398a5\") " Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.449896 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9502d16-c7a7-47ab-8103-f05229ca14ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9502d16-c7a7-47ab-8103-f05229ca14ae" (UID: "e9502d16-c7a7-47ab-8103-f05229ca14ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.449903 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a22ec1e-d67b-40f0-9adc-0cc81be398a5" (UID: "0a22ec1e-d67b-40f0-9adc-0cc81be398a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.451267 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9502d16-c7a7-47ab-8103-f05229ca14ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.451286 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.451296 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvxb\" (UniqueName: \"kubernetes.io/projected/e9b34438-358d-4397-8d7a-0489bddea606-kube-api-access-9fvxb\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.452637 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-kube-api-access-p4hcj" (OuterVolumeSpecName: "kube-api-access-p4hcj") pod "0a22ec1e-d67b-40f0-9adc-0cc81be398a5" (UID: "0a22ec1e-d67b-40f0-9adc-0cc81be398a5"). InnerVolumeSpecName "kube-api-access-p4hcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.452669 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9502d16-c7a7-47ab-8103-f05229ca14ae-kube-api-access-865zm" (OuterVolumeSpecName: "kube-api-access-865zm") pod "e9502d16-c7a7-47ab-8103-f05229ca14ae" (UID: "e9502d16-c7a7-47ab-8103-f05229ca14ae"). InnerVolumeSpecName "kube-api-access-865zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.554089 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4hcj\" (UniqueName: \"kubernetes.io/projected/0a22ec1e-d67b-40f0-9adc-0cc81be398a5-kube-api-access-p4hcj\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.554141 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-865zm\" (UniqueName: \"kubernetes.io/projected/e9502d16-c7a7-47ab-8103-f05229ca14ae-kube-api-access-865zm\") on node \"crc\" DevicePath \"\"" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.795433 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7phmt" event={"ID":"0a22ec1e-d67b-40f0-9adc-0cc81be398a5","Type":"ContainerDied","Data":"1f9b229043fa7a108cdbda573c589681d246dbbf54765799fd038ce7a6da5d3e"} Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.796218 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f9b229043fa7a108cdbda573c589681d246dbbf54765799fd038ce7a6da5d3e" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.795490 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7phmt" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.798534 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-01db-account-create-update-5crxt" event={"ID":"e9b34438-358d-4397-8d7a-0489bddea606","Type":"ContainerDied","Data":"cc6329284e873af8917cd067d23e0c2c21f5b5b6b7d713579e013e46ac7c59ee"} Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.798559 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-01db-account-create-update-5crxt" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.798570 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc6329284e873af8917cd067d23e0c2c21f5b5b6b7d713579e013e46ac7c59ee" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.802127 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" event={"ID":"e9502d16-c7a7-47ab-8103-f05229ca14ae","Type":"ContainerDied","Data":"73719fe4647adde3c270fff7e8681cd3c12c51b983005611aeea4c618bd0b7ca"} Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.802158 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73719fe4647adde3c270fff7e8681cd3c12c51b983005611aeea4c618bd0b7ca" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.802287 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d240-account-create-update-sxl8m" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.805931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9160-account-create-update-2ph45" event={"ID":"751ad68c-37cf-403a-8a09-e0ff6a096874","Type":"ContainerDied","Data":"4aacdbd13a8297c71d5bfcad40be0840e2d274bc956ff6d9fe8056fb509a8d00"} Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.805986 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aacdbd13a8297c71d5bfcad40be0840e2d274bc956ff6d9fe8056fb509a8d00" Feb 17 19:18:51 crc kubenswrapper[4892]: I0217 19:18:51.806001 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9160-account-create-update-2ph45" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.223863 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sqz4g"] Feb 17 19:18:57 crc kubenswrapper[4892]: E0217 19:18:57.225058 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40507831-2854-429d-b79a-2da3e53325ba" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225080 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="40507831-2854-429d-b79a-2da3e53325ba" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: E0217 19:18:57.225128 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9502d16-c7a7-47ab-8103-f05229ca14ae" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225139 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9502d16-c7a7-47ab-8103-f05229ca14ae" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: E0217 19:18:57.225158 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="953c04db-941c-4906-92b2-69578f20f3ce" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225169 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="953c04db-941c-4906-92b2-69578f20f3ce" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: E0217 19:18:57.225185 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b34438-358d-4397-8d7a-0489bddea606" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225196 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b34438-358d-4397-8d7a-0489bddea606" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: E0217 19:18:57.225212 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751ad68c-37cf-403a-8a09-e0ff6a096874" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225223 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="751ad68c-37cf-403a-8a09-e0ff6a096874" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: E0217 19:18:57.225240 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a22ec1e-d67b-40f0-9adc-0cc81be398a5" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225250 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a22ec1e-d67b-40f0-9adc-0cc81be398a5" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225596 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a22ec1e-d67b-40f0-9adc-0cc81be398a5" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225620 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="751ad68c-37cf-403a-8a09-e0ff6a096874" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc 
kubenswrapper[4892]: I0217 19:18:57.225641 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="953c04db-941c-4906-92b2-69578f20f3ce" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225675 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9502d16-c7a7-47ab-8103-f05229ca14ae" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225694 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="40507831-2854-429d-b79a-2da3e53325ba" containerName="mariadb-database-create" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.225715 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b34438-358d-4397-8d7a-0489bddea606" containerName="mariadb-account-create-update" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.227144 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.229716 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-smnqn" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.230651 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.230980 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.244333 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sqz4g"] Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.280502 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-config-data\") pod 
\"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.280573 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-scripts\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.280658 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gctw8\" (UniqueName: \"kubernetes.io/projected/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-kube-api-access-gctw8\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.280687 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.382876 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.383080 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-config-data\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.383130 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-scripts\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.383172 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gctw8\" (UniqueName: \"kubernetes.io/projected/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-kube-api-access-gctw8\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.390122 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.390840 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-scripts\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.393990 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-config-data\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.401289 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gctw8\" (UniqueName: \"kubernetes.io/projected/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-kube-api-access-gctw8\") pod \"nova-cell0-conductor-db-sync-sqz4g\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:57 crc kubenswrapper[4892]: I0217 19:18:57.554175 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:18:58 crc kubenswrapper[4892]: I0217 19:18:58.016248 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sqz4g"] Feb 17 19:18:58 crc kubenswrapper[4892]: W0217 19:18:58.036795 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ee6a2b0_3ba3_46cb_92c0_1b9a15b027dc.slice/crio-53f9b5b0121286fa8bae05f210819ab77f49a23bd9ff5173da5b206528bc2ca4 WatchSource:0}: Error finding container 53f9b5b0121286fa8bae05f210819ab77f49a23bd9ff5173da5b206528bc2ca4: Status 404 returned error can't find the container with id 53f9b5b0121286fa8bae05f210819ab77f49a23bd9ff5173da5b206528bc2ca4 Feb 17 19:18:58 crc kubenswrapper[4892]: I0217 19:18:58.891581 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" event={"ID":"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc","Type":"ContainerStarted","Data":"6b16a0927f66df6489fd569847254692c7552c057528d5e7c548c2cf5eb5f9cc"} Feb 17 19:18:58 crc kubenswrapper[4892]: I0217 19:18:58.891959 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-sqz4g" event={"ID":"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc","Type":"ContainerStarted","Data":"53f9b5b0121286fa8bae05f210819ab77f49a23bd9ff5173da5b206528bc2ca4"} Feb 17 19:18:58 crc kubenswrapper[4892]: I0217 19:18:58.920577 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" podStartSLOduration=1.920555265 podStartE2EDuration="1.920555265s" podCreationTimestamp="2026-02-17 19:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:18:58.907561714 +0000 UTC m=+5710.282964999" watchObservedRunningTime="2026-02-17 19:18:58.920555265 +0000 UTC m=+5710.295958540" Feb 17 19:19:02 crc kubenswrapper[4892]: I0217 19:19:02.081322 4892 scope.go:117] "RemoveContainer" containerID="1d4abc8967695e153a07c6e0c365169f9fb3898c506bc87a28d5bb8889d4660d" Feb 17 19:19:03 crc kubenswrapper[4892]: I0217 19:19:03.961215 4892 generic.go:334] "Generic (PLEG): container finished" podID="8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" containerID="6b16a0927f66df6489fd569847254692c7552c057528d5e7c548c2cf5eb5f9cc" exitCode=0 Feb 17 19:19:03 crc kubenswrapper[4892]: I0217 19:19:03.961520 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" event={"ID":"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc","Type":"ContainerDied","Data":"6b16a0927f66df6489fd569847254692c7552c057528d5e7c548c2cf5eb5f9cc"} Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.324851 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.387804 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-config-data\") pod \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.387928 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-combined-ca-bundle\") pod \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.388321 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-scripts\") pod \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.389471 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gctw8\" (UniqueName: \"kubernetes.io/projected/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-kube-api-access-gctw8\") pod \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\" (UID: \"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc\") " Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.403337 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-kube-api-access-gctw8" (OuterVolumeSpecName: "kube-api-access-gctw8") pod "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" (UID: "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc"). InnerVolumeSpecName "kube-api-access-gctw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.403351 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-scripts" (OuterVolumeSpecName: "scripts") pod "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" (UID: "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.424707 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-config-data" (OuterVolumeSpecName: "config-data") pod "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" (UID: "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.432144 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" (UID: "8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.491785 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.491843 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gctw8\" (UniqueName: \"kubernetes.io/projected/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-kube-api-access-gctw8\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.491858 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.491870 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.989478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" event={"ID":"8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc","Type":"ContainerDied","Data":"53f9b5b0121286fa8bae05f210819ab77f49a23bd9ff5173da5b206528bc2ca4"} Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.989528 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53f9b5b0121286fa8bae05f210819ab77f49a23bd9ff5173da5b206528bc2ca4" Feb 17 19:19:05 crc kubenswrapper[4892]: I0217 19:19:05.989527 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sqz4g" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.086588 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:19:06 crc kubenswrapper[4892]: E0217 19:19:06.087582 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" containerName="nova-cell0-conductor-db-sync" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.087627 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" containerName="nova-cell0-conductor-db-sync" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.088278 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" containerName="nova-cell0-conductor-db-sync" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.089908 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.092914 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-smnqn" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.096671 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.106776 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.106942 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljjwm\" (UniqueName: 
\"kubernetes.io/projected/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-kube-api-access-ljjwm\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.107054 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.111769 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.208590 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.208782 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.208806 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljjwm\" (UniqueName: \"kubernetes.io/projected/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-kube-api-access-ljjwm\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.212435 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.212658 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.228280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljjwm\" (UniqueName: \"kubernetes.io/projected/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-kube-api-access-ljjwm\") pod \"nova-cell0-conductor-0\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.412260 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:06 crc kubenswrapper[4892]: I0217 19:19:06.989252 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:19:07 crc kubenswrapper[4892]: I0217 19:19:07.004542 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3","Type":"ContainerStarted","Data":"adfe6ce88553b02c5949e2a13827ff8d7ec6c2f5c142aabf5565fac0ef709b34"} Feb 17 19:19:07 crc kubenswrapper[4892]: I0217 19:19:07.424302 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:19:07 crc kubenswrapper[4892]: I0217 19:19:07.424666 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:19:08 crc kubenswrapper[4892]: I0217 19:19:08.040969 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3","Type":"ContainerStarted","Data":"3d7ba1b53e5252313403655198580bfb151aeca1caa5cc295583ec81123f5097"} Feb 17 19:19:08 crc kubenswrapper[4892]: I0217 19:19:08.041153 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:16 crc kubenswrapper[4892]: I0217 19:19:16.459528 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 19:19:16 crc kubenswrapper[4892]: I0217 19:19:16.488926 
4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.488906368 podStartE2EDuration="10.488906368s" podCreationTimestamp="2026-02-17 19:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:08.06333413 +0000 UTC m=+5719.438737405" watchObservedRunningTime="2026-02-17 19:19:16.488906368 +0000 UTC m=+5727.864309633" Feb 17 19:19:16 crc kubenswrapper[4892]: I0217 19:19:16.996953 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-j7t5v"] Feb 17 19:19:16 crc kubenswrapper[4892]: I0217 19:19:16.998849 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.000875 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.002158 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.009897 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-j7t5v"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.038197 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.038243 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-config-data\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.038316 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-scripts\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.038333 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl86f\" (UniqueName: \"kubernetes.io/projected/e6a454e7-b6ca-4e29-9579-6e11202bcf98-kube-api-access-gl86f\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.128252 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.129694 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.139801 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141617 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-scripts\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141645 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl86f\" (UniqueName: \"kubernetes.io/projected/e6a454e7-b6ca-4e29-9579-6e11202bcf98-kube-api-access-gl86f\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141671 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-config-data\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141785 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwc5x\" (UniqueName: 
\"kubernetes.io/projected/6860fb76-f8d0-4d14-872a-cfc33ad6887e-kube-api-access-jwc5x\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141804 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.141841 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-config-data\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.168424 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-scripts\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.183453 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-config-data\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.183734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-combined-ca-bundle\") pod 
\"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.194464 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl86f\" (UniqueName: \"kubernetes.io/projected/e6a454e7-b6ca-4e29-9579-6e11202bcf98-kube-api-access-gl86f\") pod \"nova-cell0-cell-mapping-j7t5v\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.219942 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.245003 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-config-data\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.245064 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.245136 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwc5x\" (UniqueName: \"kubernetes.io/projected/6860fb76-f8d0-4d14-872a-cfc33ad6887e-kube-api-access-jwc5x\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.254386 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.256105 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-config-data\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.281631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwc5x\" (UniqueName: \"kubernetes.io/projected/6860fb76-f8d0-4d14-872a-cfc33ad6887e-kube-api-access-jwc5x\") pod \"nova-scheduler-0\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.292108 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.293962 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.314776 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.317115 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.343499 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.345458 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.347263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-config-data\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.347299 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.347335 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-logs\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.347385 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkv8\" (UniqueName: \"kubernetes.io/projected/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-kube-api-access-6gkv8\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.350718 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.360409 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.417906 4892 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.417942 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6785d9ddb9-ckc89"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.419476 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.428979 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6785d9ddb9-ckc89"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450593 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-dns-svc\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450642 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkv8\" (UniqueName: \"kubernetes.io/projected/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-kube-api-access-6gkv8\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450679 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450707 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450737 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-nb\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450761 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97dgg\" (UniqueName: \"kubernetes.io/projected/1a1e86d9-6f05-4489-a9f2-008c85d563bb-kube-api-access-97dgg\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450779 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a1e86d9-6f05-4489-a9f2-008c85d563bb-logs\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450865 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-config-data\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450887 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krd5r\" (UniqueName: \"kubernetes.io/projected/d2641e63-3c52-4760-a46d-b01684e4ebb5-kube-api-access-krd5r\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: 
\"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-config\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.450975 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-config-data\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.451004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.451041 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-logs\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.457603 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.458253 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-logs\") pod \"nova-metadata-0\" (UID: 
\"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.459068 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.461131 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.472997 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.474721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkv8\" (UniqueName: \"kubernetes.io/projected/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-kube-api-access-6gkv8\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.480214 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-config-data\") pod \"nova-metadata-0\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.488001 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.552770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " 
pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553024 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-sb\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-nb\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553082 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97dgg\" (UniqueName: \"kubernetes.io/projected/1a1e86d9-6f05-4489-a9f2-008c85d563bb-kube-api-access-97dgg\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553100 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a1e86d9-6f05-4489-a9f2-008c85d563bb-logs\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553134 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-config-data\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553150 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-krd5r\" (UniqueName: \"kubernetes.io/projected/d2641e63-3c52-4760-a46d-b01684e4ebb5-kube-api-access-krd5r\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553183 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553200 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553220 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-config\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553291 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-dns-svc\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.553324 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htgx\" 
(UniqueName: \"kubernetes.io/projected/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-kube-api-access-9htgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.560693 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.561152 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-config\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.561316 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-dns-svc\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.561677 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-nb\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.561898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-sb\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: 
\"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.562175 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a1e86d9-6f05-4489-a9f2-008c85d563bb-logs\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.562192 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.584593 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-config-data\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.587799 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krd5r\" (UniqueName: \"kubernetes.io/projected/d2641e63-3c52-4760-a46d-b01684e4ebb5-kube-api-access-krd5r\") pod \"dnsmasq-dns-6785d9ddb9-ckc89\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") " pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.600572 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97dgg\" (UniqueName: \"kubernetes.io/projected/1a1e86d9-6f05-4489-a9f2-008c85d563bb-kube-api-access-97dgg\") pod \"nova-api-0\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.651930 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.654430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htgx\" (UniqueName: \"kubernetes.io/projected/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-kube-api-access-9htgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.654539 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.654561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.660327 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.661276 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 
19:19:17.673626 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htgx\" (UniqueName: \"kubernetes.io/projected/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-kube-api-access-9htgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.748502 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.788075 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.819468 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:17 crc kubenswrapper[4892]: I0217 19:19:17.962521 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-j7t5v"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.181483 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.193654 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j7t5v" event={"ID":"e6a454e7-b6ca-4e29-9579-6e11202bcf98","Type":"ContainerStarted","Data":"c82f041831853527efa635daae8f08db95e3a2f4911b9221c56338da464e66dd"} Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.193697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j7t5v" event={"ID":"e6a454e7-b6ca-4e29-9579-6e11202bcf98","Type":"ContainerStarted","Data":"514199cdef9d2b239606e4a36ad9098764f5d643b60ad18231c987dbc84d3d87"} Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.223786 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-j7t5v" 
podStartSLOduration=2.223768876 podStartE2EDuration="2.223768876s" podCreationTimestamp="2026-02-17 19:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:18.211788203 +0000 UTC m=+5729.587191488" watchObservedRunningTime="2026-02-17 19:19:18.223768876 +0000 UTC m=+5729.599172141" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.290060 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7f66c"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.291380 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.295811 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.296066 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.305399 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7f66c"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.369259 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6785d9ddb9-ckc89"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.386939 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.434501 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.477743 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-scripts\") pod 
\"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.477921 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-config-data\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.477956 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.478000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbc6g\" (UniqueName: \"kubernetes.io/projected/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-kube-api-access-jbc6g\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.575030 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:19:18 crc kubenswrapper[4892]: W0217 19:19:18.577672 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9243c8d_1b08_4ff6_aca9_be5aab1a4437.slice/crio-97c038e15897a4dcbdfb0f52b0be760dc44f6e83e87a3ce2f27ada6073485d85 WatchSource:0}: Error finding container 97c038e15897a4dcbdfb0f52b0be760dc44f6e83e87a3ce2f27ada6073485d85: Status 404 
returned error can't find the container with id 97c038e15897a4dcbdfb0f52b0be760dc44f6e83e87a3ce2f27ada6073485d85 Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.579599 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-scripts\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.579760 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-config-data\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.579785 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.579842 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbc6g\" (UniqueName: \"kubernetes.io/projected/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-kube-api-access-jbc6g\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.583635 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-scripts\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: 
\"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.585330 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.588364 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-config-data\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.593326 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbc6g\" (UniqueName: \"kubernetes.io/projected/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-kube-api-access-jbc6g\") pod \"nova-cell1-conductor-db-sync-7f66c\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.663124 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:18 crc kubenswrapper[4892]: I0217 19:19:18.957121 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7f66c"] Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.206002 4892 generic.go:334] "Generic (PLEG): container finished" podID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerID="017d5b444118121233935c69b7c089105c2a96752202a45685379f5c53455d2b" exitCode=0 Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.206391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" event={"ID":"d2641e63-3c52-4760-a46d-b01684e4ebb5","Type":"ContainerDied","Data":"017d5b444118121233935c69b7c089105c2a96752202a45685379f5c53455d2b"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.206421 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" event={"ID":"d2641e63-3c52-4760-a46d-b01684e4ebb5","Type":"ContainerStarted","Data":"dacc19e8c876016cc4bf63fe2b31e8ae4c98d5a944c9f2a94b903a9a80a28fac"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.210801 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a1e86d9-6f05-4489-a9f2-008c85d563bb","Type":"ContainerStarted","Data":"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.210848 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a1e86d9-6f05-4489-a9f2-008c85d563bb","Type":"ContainerStarted","Data":"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.210858 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1a1e86d9-6f05-4489-a9f2-008c85d563bb","Type":"ContainerStarted","Data":"2e0db0a36710fe2a317f7e709d306628b8db3baf1b2d96ebf95b28898c606397"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.214997 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9243c8d-1b08-4ff6-aca9-be5aab1a4437","Type":"ContainerStarted","Data":"8f577a819e93460b86b0612097544a6ee471507bd452e9a5a9ad36b7fd3da05a"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.215033 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9243c8d-1b08-4ff6-aca9-be5aab1a4437","Type":"ContainerStarted","Data":"97c038e15897a4dcbdfb0f52b0be760dc44f6e83e87a3ce2f27ada6073485d85"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.218303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18c586e6-4c06-45f0-9e7e-bd1ddd615e18","Type":"ContainerStarted","Data":"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.218350 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18c586e6-4c06-45f0-9e7e-bd1ddd615e18","Type":"ContainerStarted","Data":"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.218362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18c586e6-4c06-45f0-9e7e-bd1ddd615e18","Type":"ContainerStarted","Data":"d1295f84c5d3a2b951fd31a0e60cd986eda7ce671854bae955f25acdaa934616"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.222006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6860fb76-f8d0-4d14-872a-cfc33ad6887e","Type":"ContainerStarted","Data":"ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8"} Feb 17 19:19:19 
crc kubenswrapper[4892]: I0217 19:19:19.222043 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6860fb76-f8d0-4d14-872a-cfc33ad6887e","Type":"ContainerStarted","Data":"be54fad0feb5b17f8cb924496b7e4f11b13e7a07d9190553fce4bb03830682da"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.232782 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7f66c" event={"ID":"31bf5754-5e90-44b7-ab1b-f4883dc02a8f","Type":"ContainerStarted","Data":"a32c3eff22184a4eeb92c6b68faad9b1c77b11a41a4ab582aa7da95cd90f3351"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.232834 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7f66c" event={"ID":"31bf5754-5e90-44b7-ab1b-f4883dc02a8f","Type":"ContainerStarted","Data":"8371344c2ee2cd612cc18abba0baeb7a47bbd5361e3b1faf3315f3898e0e6320"} Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.268498 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.2684828980000002 podStartE2EDuration="2.268482898s" podCreationTimestamp="2026-02-17 19:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:19.262842205 +0000 UTC m=+5730.638245470" watchObservedRunningTime="2026-02-17 19:19:19.268482898 +0000 UTC m=+5730.643886153" Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.295058 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.2950384440000002 podStartE2EDuration="2.295038444s" podCreationTimestamp="2026-02-17 19:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:19.281747686 +0000 UTC m=+5730.657150951" 
watchObservedRunningTime="2026-02-17 19:19:19.295038444 +0000 UTC m=+5730.670441709" Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.305738 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.305718052 podStartE2EDuration="2.305718052s" podCreationTimestamp="2026-02-17 19:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:19.295555248 +0000 UTC m=+5730.670958513" watchObservedRunningTime="2026-02-17 19:19:19.305718052 +0000 UTC m=+5730.681121317" Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.320590 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.320574523 podStartE2EDuration="2.320574523s" podCreationTimestamp="2026-02-17 19:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:19.318389034 +0000 UTC m=+5730.693792299" watchObservedRunningTime="2026-02-17 19:19:19.320574523 +0000 UTC m=+5730.695977788" Feb 17 19:19:19 crc kubenswrapper[4892]: I0217 19:19:19.350759 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7f66c" podStartSLOduration=1.350741886 podStartE2EDuration="1.350741886s" podCreationTimestamp="2026-02-17 19:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:19.332996838 +0000 UTC m=+5730.708400103" watchObservedRunningTime="2026-02-17 19:19:19.350741886 +0000 UTC m=+5730.726145151" Feb 17 19:19:20 crc kubenswrapper[4892]: I0217 19:19:20.246410 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" 
event={"ID":"d2641e63-3c52-4760-a46d-b01684e4ebb5","Type":"ContainerStarted","Data":"21792d4e8dcff1b1ab8500689dffc6a2345228a33400e1ae3d2b3fa593dfcf71"} Feb 17 19:19:20 crc kubenswrapper[4892]: I0217 19:19:20.275047 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" podStartSLOduration=3.275026079 podStartE2EDuration="3.275026079s" podCreationTimestamp="2026-02-17 19:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:20.267938958 +0000 UTC m=+5731.643342223" watchObservedRunningTime="2026-02-17 19:19:20.275026079 +0000 UTC m=+5731.650429354" Feb 17 19:19:21 crc kubenswrapper[4892]: I0217 19:19:21.257913 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:22 crc kubenswrapper[4892]: I0217 19:19:22.270556 4892 generic.go:334] "Generic (PLEG): container finished" podID="31bf5754-5e90-44b7-ab1b-f4883dc02a8f" containerID="a32c3eff22184a4eeb92c6b68faad9b1c77b11a41a4ab582aa7da95cd90f3351" exitCode=0 Feb 17 19:19:22 crc kubenswrapper[4892]: I0217 19:19:22.271635 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7f66c" event={"ID":"31bf5754-5e90-44b7-ab1b-f4883dc02a8f","Type":"ContainerDied","Data":"a32c3eff22184a4eeb92c6b68faad9b1c77b11a41a4ab582aa7da95cd90f3351"} Feb 17 19:19:22 crc kubenswrapper[4892]: I0217 19:19:22.563464 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 19:19:22 crc kubenswrapper[4892]: I0217 19:19:22.652745 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 19:19:22 crc kubenswrapper[4892]: I0217 19:19:22.652847 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 19:19:22 crc 
kubenswrapper[4892]: I0217 19:19:22.819925 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.286866 4892 generic.go:334] "Generic (PLEG): container finished" podID="e6a454e7-b6ca-4e29-9579-6e11202bcf98" containerID="c82f041831853527efa635daae8f08db95e3a2f4911b9221c56338da464e66dd" exitCode=0 Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.286917 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j7t5v" event={"ID":"e6a454e7-b6ca-4e29-9579-6e11202bcf98","Type":"ContainerDied","Data":"c82f041831853527efa635daae8f08db95e3a2f4911b9221c56338da464e66dd"} Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.775946 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.911185 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-combined-ca-bundle\") pod \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.911506 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-scripts\") pod \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.911703 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-config-data\") pod \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " Feb 17 19:19:23 crc kubenswrapper[4892]: 
I0217 19:19:23.911830 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbc6g\" (UniqueName: \"kubernetes.io/projected/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-kube-api-access-jbc6g\") pod \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\" (UID: \"31bf5754-5e90-44b7-ab1b-f4883dc02a8f\") " Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.919121 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-scripts" (OuterVolumeSpecName: "scripts") pod "31bf5754-5e90-44b7-ab1b-f4883dc02a8f" (UID: "31bf5754-5e90-44b7-ab1b-f4883dc02a8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.919373 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-kube-api-access-jbc6g" (OuterVolumeSpecName: "kube-api-access-jbc6g") pod "31bf5754-5e90-44b7-ab1b-f4883dc02a8f" (UID: "31bf5754-5e90-44b7-ab1b-f4883dc02a8f"). InnerVolumeSpecName "kube-api-access-jbc6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.947664 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31bf5754-5e90-44b7-ab1b-f4883dc02a8f" (UID: "31bf5754-5e90-44b7-ab1b-f4883dc02a8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:23 crc kubenswrapper[4892]: I0217 19:19:23.959373 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-config-data" (OuterVolumeSpecName: "config-data") pod "31bf5754-5e90-44b7-ab1b-f4883dc02a8f" (UID: "31bf5754-5e90-44b7-ab1b-f4883dc02a8f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.015901 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbc6g\" (UniqueName: \"kubernetes.io/projected/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-kube-api-access-jbc6g\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.015959 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.015978 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.015993 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31bf5754-5e90-44b7-ab1b-f4883dc02a8f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.300706 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7f66c" event={"ID":"31bf5754-5e90-44b7-ab1b-f4883dc02a8f","Type":"ContainerDied","Data":"8371344c2ee2cd612cc18abba0baeb7a47bbd5361e3b1faf3315f3898e0e6320"} Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.300796 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8371344c2ee2cd612cc18abba0baeb7a47bbd5361e3b1faf3315f3898e0e6320" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.301989 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7f66c" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.392048 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:19:24 crc kubenswrapper[4892]: E0217 19:19:24.392567 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bf5754-5e90-44b7-ab1b-f4883dc02a8f" containerName="nova-cell1-conductor-db-sync" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.392597 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bf5754-5e90-44b7-ab1b-f4883dc02a8f" containerName="nova-cell1-conductor-db-sync" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.393004 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bf5754-5e90-44b7-ab1b-f4883dc02a8f" containerName="nova-cell1-conductor-db-sync" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.393856 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.415985 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.416395 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.430137 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.430212 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25ld4\" (UniqueName: 
\"kubernetes.io/projected/dba96c41-f19e-47ca-9cd0-1cf12d32d448-kube-api-access-25ld4\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.430296 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.533669 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.534067 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25ld4\" (UniqueName: \"kubernetes.io/projected/dba96c41-f19e-47ca-9cd0-1cf12d32d448-kube-api-access-25ld4\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.534232 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.539050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.542216 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.560091 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25ld4\" (UniqueName: \"kubernetes.io/projected/dba96c41-f19e-47ca-9cd0-1cf12d32d448-kube-api-access-25ld4\") pod \"nova-cell1-conductor-0\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.755354 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.764567 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.839480 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-scripts\") pod \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.839542 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-config-data\") pod \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.839571 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl86f\" (UniqueName: \"kubernetes.io/projected/e6a454e7-b6ca-4e29-9579-6e11202bcf98-kube-api-access-gl86f\") pod \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.839662 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-combined-ca-bundle\") pod \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\" (UID: \"e6a454e7-b6ca-4e29-9579-6e11202bcf98\") " Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.843733 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-scripts" (OuterVolumeSpecName: "scripts") pod "e6a454e7-b6ca-4e29-9579-6e11202bcf98" (UID: "e6a454e7-b6ca-4e29-9579-6e11202bcf98"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.845461 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a454e7-b6ca-4e29-9579-6e11202bcf98-kube-api-access-gl86f" (OuterVolumeSpecName: "kube-api-access-gl86f") pod "e6a454e7-b6ca-4e29-9579-6e11202bcf98" (UID: "e6a454e7-b6ca-4e29-9579-6e11202bcf98"). InnerVolumeSpecName "kube-api-access-gl86f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.878950 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-config-data" (OuterVolumeSpecName: "config-data") pod "e6a454e7-b6ca-4e29-9579-6e11202bcf98" (UID: "e6a454e7-b6ca-4e29-9579-6e11202bcf98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.881504 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6a454e7-b6ca-4e29-9579-6e11202bcf98" (UID: "e6a454e7-b6ca-4e29-9579-6e11202bcf98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.944519 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.944552 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl86f\" (UniqueName: \"kubernetes.io/projected/e6a454e7-b6ca-4e29-9579-6e11202bcf98-kube-api-access-gl86f\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.944564 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:24 crc kubenswrapper[4892]: I0217 19:19:24.944601 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a454e7-b6ca-4e29-9579-6e11202bcf98-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.251870 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.319556 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j7t5v" event={"ID":"e6a454e7-b6ca-4e29-9579-6e11202bcf98","Type":"ContainerDied","Data":"514199cdef9d2b239606e4a36ad9098764f5d643b60ad18231c987dbc84d3d87"} Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.319623 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514199cdef9d2b239606e4a36ad9098764f5d643b60ad18231c987dbc84d3d87" Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.319720 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j7t5v" Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.324666 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dba96c41-f19e-47ca-9cd0-1cf12d32d448","Type":"ContainerStarted","Data":"aef695fb4441ae72ad5b69d3f2cfa0e99efab8f819d61e69cbb5c40fddbe3913"} Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.486987 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.488061 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-api" containerID="cri-o://8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069" gracePeriod=30 Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.495206 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-log" containerID="cri-o://5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570" gracePeriod=30 Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.497925 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.498139 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6860fb76-f8d0-4d14-872a-cfc33ad6887e" containerName="nova-scheduler-scheduler" containerID="cri-o://ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8" gracePeriod=30 Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.529510 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.529778 4892 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-metadata-0" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-log" containerID="cri-o://3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb" gracePeriod=30 Feb 17 19:19:25 crc kubenswrapper[4892]: I0217 19:19:25.529936 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-metadata" containerID="cri-o://ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe" gracePeriod=30 Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.172623 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.173868 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.282757 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-combined-ca-bundle\") pod \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.282887 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97dgg\" (UniqueName: \"kubernetes.io/projected/1a1e86d9-6f05-4489-a9f2-008c85d563bb-kube-api-access-97dgg\") pod \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.282939 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-logs\") pod \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " Feb 
17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.282978 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-config-data\") pod \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.283052 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-config-data\") pod \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.283163 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gkv8\" (UniqueName: \"kubernetes.io/projected/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-kube-api-access-6gkv8\") pod \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.283269 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-combined-ca-bundle\") pod \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\" (UID: \"18c586e6-4c06-45f0-9e7e-bd1ddd615e18\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.283365 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-logs" (OuterVolumeSpecName: "logs") pod "18c586e6-4c06-45f0-9e7e-bd1ddd615e18" (UID: "18c586e6-4c06-45f0-9e7e-bd1ddd615e18"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.283387 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a1e86d9-6f05-4489-a9f2-008c85d563bb-logs\") pod \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\" (UID: \"1a1e86d9-6f05-4489-a9f2-008c85d563bb\") " Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.283947 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a1e86d9-6f05-4489-a9f2-008c85d563bb-logs" (OuterVolumeSpecName: "logs") pod "1a1e86d9-6f05-4489-a9f2-008c85d563bb" (UID: "1a1e86d9-6f05-4489-a9f2-008c85d563bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.284364 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.284395 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a1e86d9-6f05-4489-a9f2-008c85d563bb-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.288305 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1e86d9-6f05-4489-a9f2-008c85d563bb-kube-api-access-97dgg" (OuterVolumeSpecName: "kube-api-access-97dgg") pod "1a1e86d9-6f05-4489-a9f2-008c85d563bb" (UID: "1a1e86d9-6f05-4489-a9f2-008c85d563bb"). InnerVolumeSpecName "kube-api-access-97dgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.288526 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-kube-api-access-6gkv8" (OuterVolumeSpecName: "kube-api-access-6gkv8") pod "18c586e6-4c06-45f0-9e7e-bd1ddd615e18" (UID: "18c586e6-4c06-45f0-9e7e-bd1ddd615e18"). InnerVolumeSpecName "kube-api-access-6gkv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.309402 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-config-data" (OuterVolumeSpecName: "config-data") pod "18c586e6-4c06-45f0-9e7e-bd1ddd615e18" (UID: "18c586e6-4c06-45f0-9e7e-bd1ddd615e18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.310038 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a1e86d9-6f05-4489-a9f2-008c85d563bb" (UID: "1a1e86d9-6f05-4489-a9f2-008c85d563bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.310058 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c586e6-4c06-45f0-9e7e-bd1ddd615e18" (UID: "18c586e6-4c06-45f0-9e7e-bd1ddd615e18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.311995 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-config-data" (OuterVolumeSpecName: "config-data") pod "1a1e86d9-6f05-4489-a9f2-008c85d563bb" (UID: "1a1e86d9-6f05-4489-a9f2-008c85d563bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334590 4892 generic.go:334] "Generic (PLEG): container finished" podID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerID="8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069" exitCode=0 Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334623 4892 generic.go:334] "Generic (PLEG): container finished" podID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerID="5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570" exitCode=143 Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334719 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a1e86d9-6f05-4489-a9f2-008c85d563bb","Type":"ContainerDied","Data":"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334939 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a1e86d9-6f05-4489-a9f2-008c85d563bb","Type":"ContainerDied","Data":"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334952 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a1e86d9-6f05-4489-a9f2-008c85d563bb","Type":"ContainerDied","Data":"2e0db0a36710fe2a317f7e709d306628b8db3baf1b2d96ebf95b28898c606397"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.334969 4892 scope.go:117] "RemoveContainer" containerID="8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.336214 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dba96c41-f19e-47ca-9cd0-1cf12d32d448","Type":"ContainerStarted","Data":"71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.336347 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.339280 4892 generic.go:334] "Generic (PLEG): container finished" podID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerID="ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe" exitCode=0 Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.339306 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerID="3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb" exitCode=143 Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.339329 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18c586e6-4c06-45f0-9e7e-bd1ddd615e18","Type":"ContainerDied","Data":"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.339351 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18c586e6-4c06-45f0-9e7e-bd1ddd615e18","Type":"ContainerDied","Data":"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.339362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18c586e6-4c06-45f0-9e7e-bd1ddd615e18","Type":"ContainerDied","Data":"d1295f84c5d3a2b951fd31a0e60cd986eda7ce671854bae955f25acdaa934616"} Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.339430 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.358526 4892 scope.go:117] "RemoveContainer" containerID="5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.371733 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.371707877 podStartE2EDuration="2.371707877s" podCreationTimestamp="2026-02-17 19:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:26.354621906 +0000 UTC m=+5737.730025171" watchObservedRunningTime="2026-02-17 19:19:26.371707877 +0000 UTC m=+5737.747111142" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.388297 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.390168 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gkv8\" (UniqueName: \"kubernetes.io/projected/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-kube-api-access-6gkv8\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.390227 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.390242 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.390255 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97dgg\" (UniqueName: 
\"kubernetes.io/projected/1a1e86d9-6f05-4489-a9f2-008c85d563bb-kube-api-access-97dgg\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.390334 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1e86d9-6f05-4489-a9f2-008c85d563bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.390348 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c586e6-4c06-45f0-9e7e-bd1ddd615e18-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.394484 4892 scope.go:117] "RemoveContainer" containerID="8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.395235 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069\": container with ID starting with 8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069 not found: ID does not exist" containerID="8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.395289 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069"} err="failed to get container status \"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069\": rpc error: code = NotFound desc = could not find container \"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069\": container with ID starting with 8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069 not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.395323 4892 scope.go:117] "RemoveContainer" 
containerID="5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.395601 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570\": container with ID starting with 5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570 not found: ID does not exist" containerID="5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.395634 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570"} err="failed to get container status \"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570\": rpc error: code = NotFound desc = could not find container \"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570\": container with ID starting with 5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570 not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.395655 4892 scope.go:117] "RemoveContainer" containerID="8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.395923 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069"} err="failed to get container status \"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069\": rpc error: code = NotFound desc = could not find container \"8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069\": container with ID starting with 8b448ace939260e5f518b050b42c8575a1fe9f82d09cfe17e6856aa9429d0069 not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.395947 4892 scope.go:117] 
"RemoveContainer" containerID="5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.396381 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570"} err="failed to get container status \"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570\": rpc error: code = NotFound desc = could not find container \"5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570\": container with ID starting with 5314f0b377b81ce0f6e9d8132d0c8829260cd9901e61c4cb036452e7d68e4570 not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.396410 4892 scope.go:117] "RemoveContainer" containerID="ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.401658 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.411943 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.422500 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.431240 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.432036 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-metadata" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432058 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-metadata" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.432068 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-log" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432078 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-log" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.432128 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-api" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432136 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-api" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.432157 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a454e7-b6ca-4e29-9579-6e11202bcf98" containerName="nova-manage" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432164 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a454e7-b6ca-4e29-9579-6e11202bcf98" containerName="nova-manage" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.432192 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-log" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432198 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-log" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432438 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-log" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432457 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a454e7-b6ca-4e29-9579-6e11202bcf98" containerName="nova-manage" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432481 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-log" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432492 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" containerName="nova-api-api" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.432502 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" containerName="nova-metadata-metadata" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.433937 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.437810 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.444042 4892 scope.go:117] "RemoveContainer" containerID="3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.447233 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.467008 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.468697 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.470962 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.472196 4892 scope.go:117] "RemoveContainer" containerID="ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.476548 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe\": container with ID starting with ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe not found: ID does not exist" containerID="ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.476583 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe"} err="failed to get container status \"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe\": rpc error: code = NotFound desc = could not find container \"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe\": container with ID starting with ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.476608 4892 scope.go:117] "RemoveContainer" containerID="3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb" Feb 17 19:19:26 crc kubenswrapper[4892]: E0217 19:19:26.478987 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb\": container with ID starting with 3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb 
not found: ID does not exist" containerID="3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.479030 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb"} err="failed to get container status \"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb\": rpc error: code = NotFound desc = could not find container \"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb\": container with ID starting with 3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.479060 4892 scope.go:117] "RemoveContainer" containerID="ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.479351 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe"} err="failed to get container status \"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe\": rpc error: code = NotFound desc = could not find container \"ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe\": container with ID starting with ead61f9b003c3be7fb7b98f23446b6822ea4fe2ec6656458b6662afcc280c3fe not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.479368 4892 scope.go:117] "RemoveContainer" containerID="3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.479689 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb"} err="failed to get container status \"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb\": rpc 
error: code = NotFound desc = could not find container \"3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb\": container with ID starting with 3944412c5973e8b4099815acd4ccf73c6eecd506ffbd039ccb4db80e3f6cb2eb not found: ID does not exist" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.480873 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.594673 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.594760 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsts5\" (UniqueName: \"kubernetes.io/projected/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-kube-api-access-qsts5\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.594788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.594829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a562ab-e2d3-47ad-806c-052d6eba74a8-logs\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.595065 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-config-data\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.595157 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-logs\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.595313 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tks7\" (UniqueName: \"kubernetes.io/projected/b9a562ab-e2d3-47ad-806c-052d6eba74a8-kube-api-access-6tks7\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.595407 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-config-data\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.696847 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsts5\" (UniqueName: \"kubernetes.io/projected/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-kube-api-access-qsts5\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.696920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.696977 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a562ab-e2d3-47ad-806c-052d6eba74a8-logs\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.697101 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-config-data\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.697150 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-logs\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.697177 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tks7\" (UniqueName: \"kubernetes.io/projected/b9a562ab-e2d3-47ad-806c-052d6eba74a8-kube-api-access-6tks7\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.697229 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-config-data\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 
19:19:26.697318 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.698143 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-logs\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.699326 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a562ab-e2d3-47ad-806c-052d6eba74a8-logs\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.701922 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.702151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-config-data\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.704479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-config-data\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " 
pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.709113 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.715422 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tks7\" (UniqueName: \"kubernetes.io/projected/b9a562ab-e2d3-47ad-806c-052d6eba74a8-kube-api-access-6tks7\") pod \"nova-metadata-0\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " pod="openstack/nova-metadata-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.724726 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsts5\" (UniqueName: \"kubernetes.io/projected/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-kube-api-access-qsts5\") pod \"nova-api-0\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.763990 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:26 crc kubenswrapper[4892]: I0217 19:19:26.786429 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.315575 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.341227 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.353546 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5be3191-ec7d-4f73-80c6-3a50bd405ea4","Type":"ContainerStarted","Data":"d6e68cf4b0c5f67a9b0ef6783fa428739726d03340f281db1aa642fce2a6902b"} Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.355763 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9a562ab-e2d3-47ad-806c-052d6eba74a8","Type":"ContainerStarted","Data":"d02a49fb1645993730a5bb486fb2c593105d3d2d895efd4c2087bb8adbd7fab4"} Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.378943 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c586e6-4c06-45f0-9e7e-bd1ddd615e18" path="/var/lib/kubelet/pods/18c586e6-4c06-45f0-9e7e-bd1ddd615e18/volumes" Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.381431 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a1e86d9-6f05-4489-a9f2-008c85d563bb" path="/var/lib/kubelet/pods/1a1e86d9-6f05-4489-a9f2-008c85d563bb/volumes" Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.790046 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.820508 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.844954 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 
19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.873857 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779fb5b75c-gs2vd"] Feb 17 19:19:27 crc kubenswrapper[4892]: I0217 19:19:27.874132 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" containerName="dnsmasq-dns" containerID="cri-o://7d0717f80637382ea7bd19ae66fb8bd6d9d1456f0d28546bf405ca9582346188" gracePeriod=10 Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.391488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9a562ab-e2d3-47ad-806c-052d6eba74a8","Type":"ContainerStarted","Data":"96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde"} Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.391537 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9a562ab-e2d3-47ad-806c-052d6eba74a8","Type":"ContainerStarted","Data":"87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff"} Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.400830 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5be3191-ec7d-4f73-80c6-3a50bd405ea4","Type":"ContainerStarted","Data":"4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd"} Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.400892 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5be3191-ec7d-4f73-80c6-3a50bd405ea4","Type":"ContainerStarted","Data":"d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34"} Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.403415 4892 generic.go:334] "Generic (PLEG): container finished" podID="7ab3a525-f249-4171-b99a-e57c330dea23" containerID="7d0717f80637382ea7bd19ae66fb8bd6d9d1456f0d28546bf405ca9582346188" exitCode=0 Feb 17 19:19:28 crc 
kubenswrapper[4892]: I0217 19:19:28.403561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" event={"ID":"7ab3a525-f249-4171-b99a-e57c330dea23","Type":"ContainerDied","Data":"7d0717f80637382ea7bd19ae66fb8bd6d9d1456f0d28546bf405ca9582346188"} Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.403590 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" event={"ID":"7ab3a525-f249-4171-b99a-e57c330dea23","Type":"ContainerDied","Data":"16d3e7fb5373b4f1193de8b3ee89f9089a6c9b01b3baf2e49cb28f81900fe577"} Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.403604 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16d3e7fb5373b4f1193de8b3ee89f9089a6c9b01b3baf2e49cb28f81900fe577" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.417763 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.417726208 podStartE2EDuration="2.417726208s" podCreationTimestamp="2026-02-17 19:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:28.410502782 +0000 UTC m=+5739.785906057" watchObservedRunningTime="2026-02-17 19:19:28.417726208 +0000 UTC m=+5739.793129473" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.419478 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.440909 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.440888112 podStartE2EDuration="2.440888112s" podCreationTimestamp="2026-02-17 19:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:28.431082118 +0000 
UTC m=+5739.806485383" watchObservedRunningTime="2026-02-17 19:19:28.440888112 +0000 UTC m=+5739.816291377" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.445337 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.545730 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-dns-svc\") pod \"7ab3a525-f249-4171-b99a-e57c330dea23\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.546204 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-config\") pod \"7ab3a525-f249-4171-b99a-e57c330dea23\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.546388 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-sb\") pod \"7ab3a525-f249-4171-b99a-e57c330dea23\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.546444 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-nb\") pod \"7ab3a525-f249-4171-b99a-e57c330dea23\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.546529 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9cfx\" (UniqueName: \"kubernetes.io/projected/7ab3a525-f249-4171-b99a-e57c330dea23-kube-api-access-x9cfx\") pod 
\"7ab3a525-f249-4171-b99a-e57c330dea23\" (UID: \"7ab3a525-f249-4171-b99a-e57c330dea23\") " Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.553171 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab3a525-f249-4171-b99a-e57c330dea23-kube-api-access-x9cfx" (OuterVolumeSpecName: "kube-api-access-x9cfx") pod "7ab3a525-f249-4171-b99a-e57c330dea23" (UID: "7ab3a525-f249-4171-b99a-e57c330dea23"). InnerVolumeSpecName "kube-api-access-x9cfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.615186 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ab3a525-f249-4171-b99a-e57c330dea23" (UID: "7ab3a525-f249-4171-b99a-e57c330dea23"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.618202 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ab3a525-f249-4171-b99a-e57c330dea23" (UID: "7ab3a525-f249-4171-b99a-e57c330dea23"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.628493 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ab3a525-f249-4171-b99a-e57c330dea23" (UID: "7ab3a525-f249-4171-b99a-e57c330dea23"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.635413 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-config" (OuterVolumeSpecName: "config") pod "7ab3a525-f249-4171-b99a-e57c330dea23" (UID: "7ab3a525-f249-4171-b99a-e57c330dea23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.649028 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.649087 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.649101 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.649114 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9cfx\" (UniqueName: \"kubernetes.io/projected/7ab3a525-f249-4171-b99a-e57c330dea23-kube-api-access-x9cfx\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:28 crc kubenswrapper[4892]: I0217 19:19:28.649126 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab3a525-f249-4171-b99a-e57c330dea23-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:29 crc kubenswrapper[4892]: I0217 19:19:29.420336 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779fb5b75c-gs2vd" Feb 17 19:19:29 crc kubenswrapper[4892]: I0217 19:19:29.453997 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779fb5b75c-gs2vd"] Feb 17 19:19:29 crc kubenswrapper[4892]: I0217 19:19:29.467168 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-779fb5b75c-gs2vd"] Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.395736 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.447114 4892 generic.go:334] "Generic (PLEG): container finished" podID="6860fb76-f8d0-4d14-872a-cfc33ad6887e" containerID="ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8" exitCode=0 Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.447152 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6860fb76-f8d0-4d14-872a-cfc33ad6887e","Type":"ContainerDied","Data":"ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8"} Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.447181 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6860fb76-f8d0-4d14-872a-cfc33ad6887e","Type":"ContainerDied","Data":"be54fad0feb5b17f8cb924496b7e4f11b13e7a07d9190553fce4bb03830682da"} Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.447198 4892 scope.go:117] "RemoveContainer" containerID="ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.447193 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.471602 4892 scope.go:117] "RemoveContainer" containerID="ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8" Feb 17 19:19:30 crc kubenswrapper[4892]: E0217 19:19:30.472141 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8\": container with ID starting with ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8 not found: ID does not exist" containerID="ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.472190 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8"} err="failed to get container status \"ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8\": rpc error: code = NotFound desc = could not find container \"ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8\": container with ID starting with ff0cc3094cc5f81a42411b6f1316968436a7bdc677f37379d01c095d339ab9d8 not found: ID does not exist" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.506893 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-combined-ca-bundle\") pod \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.507267 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwc5x\" (UniqueName: \"kubernetes.io/projected/6860fb76-f8d0-4d14-872a-cfc33ad6887e-kube-api-access-jwc5x\") pod \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\" (UID: 
\"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.507329 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-config-data\") pod \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\" (UID: \"6860fb76-f8d0-4d14-872a-cfc33ad6887e\") " Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.512339 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6860fb76-f8d0-4d14-872a-cfc33ad6887e-kube-api-access-jwc5x" (OuterVolumeSpecName: "kube-api-access-jwc5x") pod "6860fb76-f8d0-4d14-872a-cfc33ad6887e" (UID: "6860fb76-f8d0-4d14-872a-cfc33ad6887e"). InnerVolumeSpecName "kube-api-access-jwc5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.536335 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6860fb76-f8d0-4d14-872a-cfc33ad6887e" (UID: "6860fb76-f8d0-4d14-872a-cfc33ad6887e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.556280 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-config-data" (OuterVolumeSpecName: "config-data") pod "6860fb76-f8d0-4d14-872a-cfc33ad6887e" (UID: "6860fb76-f8d0-4d14-872a-cfc33ad6887e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.611365 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.611434 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwc5x\" (UniqueName: \"kubernetes.io/projected/6860fb76-f8d0-4d14-872a-cfc33ad6887e-kube-api-access-jwc5x\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.611464 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860fb76-f8d0-4d14-872a-cfc33ad6887e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.784779 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.804331 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.814500 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:30 crc kubenswrapper[4892]: E0217 19:19:30.825942 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" containerName="init" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.825987 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" containerName="init" Feb 17 19:19:30 crc kubenswrapper[4892]: E0217 19:19:30.826013 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" containerName="dnsmasq-dns" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.826019 4892 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" containerName="dnsmasq-dns" Feb 17 19:19:30 crc kubenswrapper[4892]: E0217 19:19:30.826050 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6860fb76-f8d0-4d14-872a-cfc33ad6887e" containerName="nova-scheduler-scheduler" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.826056 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6860fb76-f8d0-4d14-872a-cfc33ad6887e" containerName="nova-scheduler-scheduler" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.826424 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" containerName="dnsmasq-dns" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.826451 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6860fb76-f8d0-4d14-872a-cfc33ad6887e" containerName="nova-scheduler-scheduler" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.827228 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.828161 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.831580 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.916452 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.916519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hbkj\" (UniqueName: \"kubernetes.io/projected/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-kube-api-access-2hbkj\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:30 crc kubenswrapper[4892]: I0217 19:19:30.916546 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-config-data\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.018193 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.018250 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hbkj\" (UniqueName: \"kubernetes.io/projected/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-kube-api-access-2hbkj\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.018280 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-config-data\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.021605 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-config-data\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.023144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.042234 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hbkj\" (UniqueName: \"kubernetes.io/projected/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-kube-api-access-2hbkj\") pod \"nova-scheduler-0\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.149049 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.387844 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6860fb76-f8d0-4d14-872a-cfc33ad6887e" path="/var/lib/kubelet/pods/6860fb76-f8d0-4d14-872a-cfc33ad6887e/volumes" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.388970 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab3a525-f249-4171-b99a-e57c330dea23" path="/var/lib/kubelet/pods/7ab3a525-f249-4171-b99a-e57c330dea23/volumes" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.637012 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.786856 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 19:19:31 crc kubenswrapper[4892]: I0217 19:19:31.788081 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 19:19:32 crc kubenswrapper[4892]: I0217 19:19:32.475464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732","Type":"ContainerStarted","Data":"459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff"} Feb 17 19:19:32 crc kubenswrapper[4892]: I0217 19:19:32.475533 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732","Type":"ContainerStarted","Data":"5b26c1325d586e633804b4c9a80a50565e4ca0a06ea29384543ebd0808e364ce"} Feb 17 19:19:32 crc kubenswrapper[4892]: I0217 19:19:32.497238 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.497221671 podStartE2EDuration="2.497221671s" podCreationTimestamp="2026-02-17 19:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:32.495411503 +0000 UTC m=+5743.870814798" watchObservedRunningTime="2026-02-17 19:19:32.497221671 +0000 UTC m=+5743.872624936" Feb 17 19:19:34 crc kubenswrapper[4892]: I0217 19:19:34.803803 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.305437 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xpqwv"] Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.307341 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.309972 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.310565 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.315415 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xpqwv"] Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.439026 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-scripts\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.439529 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6655\" (UniqueName: \"kubernetes.io/projected/f9b1b334-4337-48c7-88bb-259fc38f15e5-kube-api-access-r6655\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: 
\"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.439590 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.439688 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-config-data\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.542120 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-scripts\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.542203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6655\" (UniqueName: \"kubernetes.io/projected/f9b1b334-4337-48c7-88bb-259fc38f15e5-kube-api-access-r6655\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.542223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: 
\"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.542259 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-config-data\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.548853 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-config-data\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.549043 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.549621 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-scripts\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.562901 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6655\" (UniqueName: \"kubernetes.io/projected/f9b1b334-4337-48c7-88bb-259fc38f15e5-kube-api-access-r6655\") pod \"nova-cell1-cell-mapping-xpqwv\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " pod="openstack/nova-cell1-cell-mapping-xpqwv" 
Feb 17 19:19:35 crc kubenswrapper[4892]: I0217 19:19:35.630497 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.149886 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.176831 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xpqwv"] Feb 17 19:19:36 crc kubenswrapper[4892]: W0217 19:19:36.178085 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b1b334_4337_48c7_88bb_259fc38f15e5.slice/crio-43f3c38797020955ee1229395ef969f0b89cd27b8a9a972e9219229a88f660fa WatchSource:0}: Error finding container 43f3c38797020955ee1229395ef969f0b89cd27b8a9a972e9219229a88f660fa: Status 404 returned error can't find the container with id 43f3c38797020955ee1229395ef969f0b89cd27b8a9a972e9219229a88f660fa Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.511562 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xpqwv" event={"ID":"f9b1b334-4337-48c7-88bb-259fc38f15e5","Type":"ContainerStarted","Data":"7ca3c76c178458c0147c5c14f7e6e40ce6c73b8caf5bf97e097aff16c0aafe56"} Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.511610 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xpqwv" event={"ID":"f9b1b334-4337-48c7-88bb-259fc38f15e5","Type":"ContainerStarted","Data":"43f3c38797020955ee1229395ef969f0b89cd27b8a9a972e9219229a88f660fa"} Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.533653 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xpqwv" podStartSLOduration=1.533631843 podStartE2EDuration="1.533631843s" podCreationTimestamp="2026-02-17 19:19:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:36.530729625 +0000 UTC m=+5747.906132960" watchObservedRunningTime="2026-02-17 19:19:36.533631843 +0000 UTC m=+5747.909035098" Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.764275 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.764722 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.787139 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 19:19:36 crc kubenswrapper[4892]: I0217 19:19:36.787209 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.427319 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.427374 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.427418 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.427968 4892 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.428009 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" gracePeriod=600 Feb 17 19:19:37 crc kubenswrapper[4892]: E0217 19:19:37.552090 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.927974 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.98:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.927983 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 
19:19:37.928843 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:19:37 crc kubenswrapper[4892]: I0217 19:19:37.929107 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.98:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:19:38 crc kubenswrapper[4892]: I0217 19:19:38.531223 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" exitCode=0 Feb 17 19:19:38 crc kubenswrapper[4892]: I0217 19:19:38.531269 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"} Feb 17 19:19:38 crc kubenswrapper[4892]: I0217 19:19:38.531302 4892 scope.go:117] "RemoveContainer" containerID="6bc133c88c6796604fb1d95fbac5023f863829395efa1fa885020ed0c34254e3" Feb 17 19:19:38 crc kubenswrapper[4892]: I0217 19:19:38.531956 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:19:38 crc kubenswrapper[4892]: E0217 19:19:38.532182 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:19:41 crc kubenswrapper[4892]: I0217 19:19:41.149457 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 19:19:41 crc kubenswrapper[4892]: I0217 19:19:41.188586 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 19:19:41 crc kubenswrapper[4892]: I0217 19:19:41.568344 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9b1b334-4337-48c7-88bb-259fc38f15e5" containerID="7ca3c76c178458c0147c5c14f7e6e40ce6c73b8caf5bf97e097aff16c0aafe56" exitCode=0 Feb 17 19:19:41 crc kubenswrapper[4892]: I0217 19:19:41.568429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xpqwv" event={"ID":"f9b1b334-4337-48c7-88bb-259fc38f15e5","Type":"ContainerDied","Data":"7ca3c76c178458c0147c5c14f7e6e40ce6c73b8caf5bf97e097aff16c0aafe56"} Feb 17 19:19:41 crc kubenswrapper[4892]: I0217 19:19:41.618354 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.043263 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.099730 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-scripts\") pod \"f9b1b334-4337-48c7-88bb-259fc38f15e5\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.099867 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-combined-ca-bundle\") pod \"f9b1b334-4337-48c7-88bb-259fc38f15e5\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.100040 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-config-data\") pod \"f9b1b334-4337-48c7-88bb-259fc38f15e5\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.100491 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6655\" (UniqueName: \"kubernetes.io/projected/f9b1b334-4337-48c7-88bb-259fc38f15e5-kube-api-access-r6655\") pod \"f9b1b334-4337-48c7-88bb-259fc38f15e5\" (UID: \"f9b1b334-4337-48c7-88bb-259fc38f15e5\") " Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.106046 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b1b334-4337-48c7-88bb-259fc38f15e5-kube-api-access-r6655" (OuterVolumeSpecName: "kube-api-access-r6655") pod "f9b1b334-4337-48c7-88bb-259fc38f15e5" (UID: "f9b1b334-4337-48c7-88bb-259fc38f15e5"). InnerVolumeSpecName "kube-api-access-r6655". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.107237 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-scripts" (OuterVolumeSpecName: "scripts") pod "f9b1b334-4337-48c7-88bb-259fc38f15e5" (UID: "f9b1b334-4337-48c7-88bb-259fc38f15e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.150439 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9b1b334-4337-48c7-88bb-259fc38f15e5" (UID: "f9b1b334-4337-48c7-88bb-259fc38f15e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.163089 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-config-data" (OuterVolumeSpecName: "config-data") pod "f9b1b334-4337-48c7-88bb-259fc38f15e5" (UID: "f9b1b334-4337-48c7-88bb-259fc38f15e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.206695 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.206747 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.206766 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b1b334-4337-48c7-88bb-259fc38f15e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.206782 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6655\" (UniqueName: \"kubernetes.io/projected/f9b1b334-4337-48c7-88bb-259fc38f15e5-kube-api-access-r6655\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.607211 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xpqwv" event={"ID":"f9b1b334-4337-48c7-88bb-259fc38f15e5","Type":"ContainerDied","Data":"43f3c38797020955ee1229395ef969f0b89cd27b8a9a972e9219229a88f660fa"} Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.607259 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43f3c38797020955ee1229395ef969f0b89cd27b8a9a972e9219229a88f660fa" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.607299 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xpqwv" Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.896147 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.897019 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-api" containerID="cri-o://4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd" gracePeriod=30 Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.896966 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-log" containerID="cri-o://d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34" gracePeriod=30 Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.921992 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.922211 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" containerName="nova-scheduler-scheduler" containerID="cri-o://459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" gracePeriod=30 Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.938330 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.938568 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-log" containerID="cri-o://87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff" gracePeriod=30 Feb 17 19:19:43 crc kubenswrapper[4892]: I0217 19:19:43.938982 4892 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-metadata" containerID="cri-o://96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde" gracePeriod=30 Feb 17 19:19:44 crc kubenswrapper[4892]: I0217 19:19:44.622173 4892 generic.go:334] "Generic (PLEG): container finished" podID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerID="d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34" exitCode=143 Feb 17 19:19:44 crc kubenswrapper[4892]: I0217 19:19:44.622276 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5be3191-ec7d-4f73-80c6-3a50bd405ea4","Type":"ContainerDied","Data":"d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34"} Feb 17 19:19:44 crc kubenswrapper[4892]: I0217 19:19:44.625954 4892 generic.go:334] "Generic (PLEG): container finished" podID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerID="87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff" exitCode=143 Feb 17 19:19:44 crc kubenswrapper[4892]: I0217 19:19:44.626019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9a562ab-e2d3-47ad-806c-052d6eba74a8","Type":"ContainerDied","Data":"87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff"} Feb 17 19:19:46 crc kubenswrapper[4892]: E0217 19:19:46.151210 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 19:19:46 crc kubenswrapper[4892]: E0217 19:19:46.153296 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 19:19:46 crc kubenswrapper[4892]: E0217 19:19:46.154526 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 19:19:46 crc kubenswrapper[4892]: E0217 19:19:46.154618 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" containerName="nova-scheduler-scheduler" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.634865 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.639377 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.677456 4892 generic.go:334] "Generic (PLEG): container finished" podID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerID="4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd" exitCode=0 Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.677514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5be3191-ec7d-4f73-80c6-3a50bd405ea4","Type":"ContainerDied","Data":"4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd"} Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.677596 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5be3191-ec7d-4f73-80c6-3a50bd405ea4","Type":"ContainerDied","Data":"d6e68cf4b0c5f67a9b0ef6783fa428739726d03340f281db1aa642fce2a6902b"} Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.677545 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.677652 4892 scope.go:117] "RemoveContainer" containerID="4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.681400 4892 generic.go:334] "Generic (PLEG): container finished" podID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerID="96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde" exitCode=0 Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.681433 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9a562ab-e2d3-47ad-806c-052d6eba74a8","Type":"ContainerDied","Data":"96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde"} Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.681456 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9a562ab-e2d3-47ad-806c-052d6eba74a8","Type":"ContainerDied","Data":"d02a49fb1645993730a5bb486fb2c593105d3d2d895efd4c2087bb8adbd7fab4"} Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.681460 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.709247 4892 scope.go:117] "RemoveContainer" containerID="d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.710566 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-config-data\") pod \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.710677 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-combined-ca-bundle\") pod \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.710729 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a562ab-e2d3-47ad-806c-052d6eba74a8-logs\") pod \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.710831 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-logs\") pod \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.710962 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsts5\" (UniqueName: \"kubernetes.io/projected/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-kube-api-access-qsts5\") pod \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " Feb 17 19:19:47 crc 
kubenswrapper[4892]: I0217 19:19:47.710993 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tks7\" (UniqueName: \"kubernetes.io/projected/b9a562ab-e2d3-47ad-806c-052d6eba74a8-kube-api-access-6tks7\") pod \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.711087 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-combined-ca-bundle\") pod \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\" (UID: \"e5be3191-ec7d-4f73-80c6-3a50bd405ea4\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.711118 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-config-data\") pod \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\" (UID: \"b9a562ab-e2d3-47ad-806c-052d6eba74a8\") " Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.711376 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a562ab-e2d3-47ad-806c-052d6eba74a8-logs" (OuterVolumeSpecName: "logs") pod "b9a562ab-e2d3-47ad-806c-052d6eba74a8" (UID: "b9a562ab-e2d3-47ad-806c-052d6eba74a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.711600 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-logs" (OuterVolumeSpecName: "logs") pod "e5be3191-ec7d-4f73-80c6-3a50bd405ea4" (UID: "e5be3191-ec7d-4f73-80c6-3a50bd405ea4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.712404 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a562ab-e2d3-47ad-806c-052d6eba74a8-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.712500 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.716627 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-kube-api-access-qsts5" (OuterVolumeSpecName: "kube-api-access-qsts5") pod "e5be3191-ec7d-4f73-80c6-3a50bd405ea4" (UID: "e5be3191-ec7d-4f73-80c6-3a50bd405ea4"). InnerVolumeSpecName "kube-api-access-qsts5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.717011 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a562ab-e2d3-47ad-806c-052d6eba74a8-kube-api-access-6tks7" (OuterVolumeSpecName: "kube-api-access-6tks7") pod "b9a562ab-e2d3-47ad-806c-052d6eba74a8" (UID: "b9a562ab-e2d3-47ad-806c-052d6eba74a8"). InnerVolumeSpecName "kube-api-access-6tks7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.732845 4892 scope.go:117] "RemoveContainer" containerID="4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd" Feb 17 19:19:47 crc kubenswrapper[4892]: E0217 19:19:47.733482 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd\": container with ID starting with 4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd not found: ID does not exist" containerID="4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.733514 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd"} err="failed to get container status \"4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd\": rpc error: code = NotFound desc = could not find container \"4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd\": container with ID starting with 4b72abdf1ab2633f2bbf4ed54ef99c02d38db7a87e30d71bb110deceab28ecbd not found: ID does not exist" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.733532 4892 scope.go:117] "RemoveContainer" containerID="d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34" Feb 17 19:19:47 crc kubenswrapper[4892]: E0217 19:19:47.733866 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34\": container with ID starting with d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34 not found: ID does not exist" containerID="d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.733885 
4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34"} err="failed to get container status \"d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34\": rpc error: code = NotFound desc = could not find container \"d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34\": container with ID starting with d8502017983b84a0ca83da22e1cce7c2e998638c312132ae3e15ce0c234ffc34 not found: ID does not exist" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.733899 4892 scope.go:117] "RemoveContainer" containerID="96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.740856 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9a562ab-e2d3-47ad-806c-052d6eba74a8" (UID: "b9a562ab-e2d3-47ad-806c-052d6eba74a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.741726 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-config-data" (OuterVolumeSpecName: "config-data") pod "e5be3191-ec7d-4f73-80c6-3a50bd405ea4" (UID: "e5be3191-ec7d-4f73-80c6-3a50bd405ea4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.749107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5be3191-ec7d-4f73-80c6-3a50bd405ea4" (UID: "e5be3191-ec7d-4f73-80c6-3a50bd405ea4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.755273 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-config-data" (OuterVolumeSpecName: "config-data") pod "b9a562ab-e2d3-47ad-806c-052d6eba74a8" (UID: "b9a562ab-e2d3-47ad-806c-052d6eba74a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.766500 4892 scope.go:117] "RemoveContainer" containerID="87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.789422 4892 scope.go:117] "RemoveContainer" containerID="96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde" Feb 17 19:19:47 crc kubenswrapper[4892]: E0217 19:19:47.789976 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde\": container with ID starting with 96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde not found: ID does not exist" containerID="96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.790029 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde"} err="failed to get container status \"96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde\": rpc error: code = NotFound desc = could not find container \"96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde\": container with ID starting with 96c7256a8f801e51e8517af3107fdb5e61b57e673163e8a8dc77991b648aadde not found: ID does not exist" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.790058 4892 scope.go:117] "RemoveContainer" 
containerID="87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff" Feb 17 19:19:47 crc kubenswrapper[4892]: E0217 19:19:47.797055 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff\": container with ID starting with 87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff not found: ID does not exist" containerID="87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.797198 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff"} err="failed to get container status \"87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff\": rpc error: code = NotFound desc = could not find container \"87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff\": container with ID starting with 87fad8914a2d8e750d67214f78b4fec531a31bdb0d7fcdf0dd9670d93a32fbff not found: ID does not exist" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.814451 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.814490 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.814502 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tks7\" (UniqueName: \"kubernetes.io/projected/b9a562ab-e2d3-47ad-806c-052d6eba74a8-kube-api-access-6tks7\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 
19:19:47.814515 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsts5\" (UniqueName: \"kubernetes.io/projected/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-kube-api-access-qsts5\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.814525 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5be3191-ec7d-4f73-80c6-3a50bd405ea4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:47 crc kubenswrapper[4892]: I0217 19:19:47.814533 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a562ab-e2d3-47ad-806c-052d6eba74a8-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.041147 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.066236 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.083967 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.102323 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.114966 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.115456 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-metadata" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115480 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-metadata" Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.115510 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-log" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115518 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-log" Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.115543 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b1b334-4337-48c7-88bb-259fc38f15e5" containerName="nova-manage" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115552 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b1b334-4337-48c7-88bb-259fc38f15e5" containerName="nova-manage" Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.115565 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-api" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115573 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-api" Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.115609 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-log" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115619 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-log" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115906 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-log" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115931 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-metadata" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115952 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b1b334-4337-48c7-88bb-259fc38f15e5" containerName="nova-manage" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115979 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" containerName="nova-metadata-log" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.115997 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" containerName="nova-api-api" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.117112 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.124986 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.128119 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.175308 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.179134 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.180984 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.197918 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.223082 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.223162 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db98bb-314d-4dd3-9a11-88cac622dea5-logs\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.223189 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb9c4\" (UniqueName: \"kubernetes.io/projected/61db98bb-314d-4dd3-9a11-88cac622dea5-kube-api-access-tb9c4\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.223213 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-config-data\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325399 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88699460-d2ea-4d92-acf7-150aff42bf48-logs\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325449 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325508 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db98bb-314d-4dd3-9a11-88cac622dea5-logs\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325532 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb9c4\" (UniqueName: \"kubernetes.io/projected/61db98bb-314d-4dd3-9a11-88cac622dea5-kube-api-access-tb9c4\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325555 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-config-data\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-config-data\") pod \"nova-metadata-0\" (UID: 
\"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325609 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pltl4\" (UniqueName: \"kubernetes.io/projected/88699460-d2ea-4d92-acf7-150aff42bf48-kube-api-access-pltl4\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.325641 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.326233 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db98bb-314d-4dd3-9a11-88cac622dea5-logs\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.332842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.333720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-config-data\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.341969 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tb9c4\" (UniqueName: \"kubernetes.io/projected/61db98bb-314d-4dd3-9a11-88cac622dea5-kube-api-access-tb9c4\") pod \"nova-metadata-0\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.427186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-config-data\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.427309 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pltl4\" (UniqueName: \"kubernetes.io/projected/88699460-d2ea-4d92-acf7-150aff42bf48-kube-api-access-pltl4\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.427371 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.427462 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88699460-d2ea-4d92-acf7-150aff42bf48-logs\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.427980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88699460-d2ea-4d92-acf7-150aff42bf48-logs\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 
crc kubenswrapper[4892]: I0217 19:19:48.430996 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.431938 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.444689 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-config-data\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.449378 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pltl4\" (UniqueName: \"kubernetes.io/projected/88699460-d2ea-4d92-acf7-150aff42bf48-kube-api-access-pltl4\") pod \"nova-api-0\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.498807 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.538534 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.631574 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hbkj\" (UniqueName: \"kubernetes.io/projected/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-kube-api-access-2hbkj\") pod \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.632191 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-config-data\") pod \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.632320 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-combined-ca-bundle\") pod \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\" (UID: \"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732\") " Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.637662 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-kube-api-access-2hbkj" (OuterVolumeSpecName: "kube-api-access-2hbkj") pod "e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" (UID: "e7c99bfe-5f65-4c1a-8244-2aa2f44fa732"). InnerVolumeSpecName "kube-api-access-2hbkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.674980 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" (UID: "e7c99bfe-5f65-4c1a-8244-2aa2f44fa732"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.677342 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-config-data" (OuterVolumeSpecName: "config-data") pod "e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" (UID: "e7c99bfe-5f65-4c1a-8244-2aa2f44fa732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.701545 4892 generic.go:334] "Generic (PLEG): container finished" podID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" exitCode=0 Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.701599 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732","Type":"ContainerDied","Data":"459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff"} Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.701619 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c99bfe-5f65-4c1a-8244-2aa2f44fa732","Type":"ContainerDied","Data":"5b26c1325d586e633804b4c9a80a50565e4ca0a06ea29384543ebd0808e364ce"} Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.701635 4892 scope.go:117] "RemoveContainer" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.703044 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.737716 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.737759 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hbkj\" (UniqueName: \"kubernetes.io/projected/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-kube-api-access-2hbkj\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.737776 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.747588 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.759886 4892 scope.go:117] "RemoveContainer" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.762052 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.764616 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff\": container with ID starting with 459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff not found: ID does not exist" containerID="459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.764734 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff"} err="failed to get container status \"459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff\": rpc error: code = NotFound desc = could not find container \"459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff\": container with ID starting with 459010c91b6d56239baf49bf6b00d53830d4e4e21ed6ef8130f165f7397502ff not found: ID does not exist" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.771565 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: E0217 19:19:48.772076 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" containerName="nova-scheduler-scheduler" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.772092 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" containerName="nova-scheduler-scheduler" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.772304 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" containerName="nova-scheduler-scheduler" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.773153 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.778416 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.781492 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.942137 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-config-data\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.942443 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqdnr\" (UniqueName: \"kubernetes.io/projected/bc7212aa-fb51-4121-a36c-99a201ee026d-kube-api-access-xqdnr\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.942875 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:48 crc kubenswrapper[4892]: I0217 19:19:48.990274 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:19:48 crc kubenswrapper[4892]: W0217 19:19:48.996618 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61db98bb_314d_4dd3_9a11_88cac622dea5.slice/crio-045150db74afd24ab33e9ba9b0f49d6ecce24470760b71deef4ef0bec6791e60 WatchSource:0}: Error finding container 045150db74afd24ab33e9ba9b0f49d6ecce24470760b71deef4ef0bec6791e60: Status 404 returned error can't find the container with id 045150db74afd24ab33e9ba9b0f49d6ecce24470760b71deef4ef0bec6791e60 Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.044396 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.044443 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-config-data\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.044499 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqdnr\" (UniqueName: \"kubernetes.io/projected/bc7212aa-fb51-4121-a36c-99a201ee026d-kube-api-access-xqdnr\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.049721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-config-data\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0" Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.051875 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.060350 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqdnr\" (UniqueName: \"kubernetes.io/projected/bc7212aa-fb51-4121-a36c-99a201ee026d-kube-api-access-xqdnr\") pod \"nova-scheduler-0\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " pod="openstack/nova-scheduler-0"
Feb 17 19:19:49 crc kubenswrapper[4892]: W0217 19:19:49.097406 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88699460_d2ea_4d92_acf7_150aff42bf48.slice/crio-8f752cbfcff50720c8a79b5e9fdd65c943fa2011783b25a14bdd4250bd9beba3 WatchSource:0}: Error finding container 8f752cbfcff50720c8a79b5e9fdd65c943fa2011783b25a14bdd4250bd9beba3: Status 404 returned error can't find the container with id 8f752cbfcff50720c8a79b5e9fdd65c943fa2011783b25a14bdd4250bd9beba3
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.100585 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.103218 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.376859 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"
Feb 17 19:19:49 crc kubenswrapper[4892]: E0217 19:19:49.377410 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.385864 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a562ab-e2d3-47ad-806c-052d6eba74a8" path="/var/lib/kubelet/pods/b9a562ab-e2d3-47ad-806c-052d6eba74a8/volumes"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.386702 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5be3191-ec7d-4f73-80c6-3a50bd405ea4" path="/var/lib/kubelet/pods/e5be3191-ec7d-4f73-80c6-3a50bd405ea4/volumes"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.387429 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c99bfe-5f65-4c1a-8244-2aa2f44fa732" path="/var/lib/kubelet/pods/e7c99bfe-5f65-4c1a-8244-2aa2f44fa732/volumes"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.617936 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.730493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61db98bb-314d-4dd3-9a11-88cac622dea5","Type":"ContainerStarted","Data":"f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.730530 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61db98bb-314d-4dd3-9a11-88cac622dea5","Type":"ContainerStarted","Data":"c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.730540 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61db98bb-314d-4dd3-9a11-88cac622dea5","Type":"ContainerStarted","Data":"045150db74afd24ab33e9ba9b0f49d6ecce24470760b71deef4ef0bec6791e60"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.734497 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc7212aa-fb51-4121-a36c-99a201ee026d","Type":"ContainerStarted","Data":"c1be2e80e5582baf9eb07295a1187c9de3095c4f521e2e406e4dd10e930aed32"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.736947 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88699460-d2ea-4d92-acf7-150aff42bf48","Type":"ContainerStarted","Data":"a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.736968 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88699460-d2ea-4d92-acf7-150aff42bf48","Type":"ContainerStarted","Data":"cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.736978 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88699460-d2ea-4d92-acf7-150aff42bf48","Type":"ContainerStarted","Data":"8f752cbfcff50720c8a79b5e9fdd65c943fa2011783b25a14bdd4250bd9beba3"}
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.762536 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.762520221 podStartE2EDuration="1.762520221s" podCreationTimestamp="2026-02-17 19:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:49.750661491 +0000 UTC m=+5761.126064766" watchObservedRunningTime="2026-02-17 19:19:49.762520221 +0000 UTC m=+5761.137923486"
Feb 17 19:19:49 crc kubenswrapper[4892]: I0217 19:19:49.787659 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.787640978 podStartE2EDuration="1.787640978s" podCreationTimestamp="2026-02-17 19:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:49.78065801 +0000 UTC m=+5761.156061315" watchObservedRunningTime="2026-02-17 19:19:49.787640978 +0000 UTC m=+5761.163044243"
Feb 17 19:19:50 crc kubenswrapper[4892]: I0217 19:19:50.748148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc7212aa-fb51-4121-a36c-99a201ee026d","Type":"ContainerStarted","Data":"c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515"}
Feb 17 19:19:53 crc kubenswrapper[4892]: I0217 19:19:53.431605 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 19:19:53 crc kubenswrapper[4892]: I0217 19:19:53.432129 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 19:19:54 crc kubenswrapper[4892]: I0217 19:19:54.103642 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 17 19:19:58 crc kubenswrapper[4892]: I0217 19:19:58.431688 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 17 19:19:58 crc kubenswrapper[4892]: I0217 19:19:58.432451 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 17 19:19:58 crc kubenswrapper[4892]: I0217 19:19:58.500023 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 19:19:58 crc kubenswrapper[4892]: I0217 19:19:58.500073 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.104273 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.132588 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.160152 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=11.160135072 podStartE2EDuration="11.160135072s" podCreationTimestamp="2026-02-17 19:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:19:50.772385272 +0000 UTC m=+5762.147788557" watchObservedRunningTime="2026-02-17 19:19:59.160135072 +0000 UTC m=+5770.535538337"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.533134 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.533331 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.617119 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.102:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.617365 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.102:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 19:19:59 crc kubenswrapper[4892]: I0217 19:19:59.918162 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 17 19:20:01 crc kubenswrapper[4892]: I0217 19:20:01.359590 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"
Feb 17 19:20:01 crc kubenswrapper[4892]: E0217 19:20:01.360097 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.433442 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.435404 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.435727 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.503471 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.504183 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.506721 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.508941 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.966533 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.971446 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 19:20:08 crc kubenswrapper[4892]: I0217 19:20:08.971835 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.229999 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c67fc5945-dlf96"]
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.238402 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.270288 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c67fc5945-dlf96"]
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.399042 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.399312 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-dns-svc\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.399423 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-config\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.399687 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-sb\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.399857 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzgq\" (UniqueName: \"kubernetes.io/projected/3a4dff54-f831-4752-b7b9-67123477ec0e-kube-api-access-pmzgq\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.501933 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-sb\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.501977 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzgq\" (UniqueName: \"kubernetes.io/projected/3a4dff54-f831-4752-b7b9-67123477ec0e-kube-api-access-pmzgq\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.502044 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.502061 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-dns-svc\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.502090 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-config\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.502810 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-config\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.505125 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.505633 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-dns-svc\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.506056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-sb\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.526282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzgq\" (UniqueName: \"kubernetes.io/projected/3a4dff54-f831-4752-b7b9-67123477ec0e-kube-api-access-pmzgq\") pod \"dnsmasq-dns-6c67fc5945-dlf96\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:09 crc kubenswrapper[4892]: I0217 19:20:09.578593 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:10 crc kubenswrapper[4892]: I0217 19:20:10.058986 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c67fc5945-dlf96"]
Feb 17 19:20:10 crc kubenswrapper[4892]: I0217 19:20:10.986211 4892 generic.go:334] "Generic (PLEG): container finished" podID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerID="be81c5109854d13651b22e33d33faca421e7b2e5666125286a1a1c60733609fe" exitCode=0
Feb 17 19:20:10 crc kubenswrapper[4892]: I0217 19:20:10.986308 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" event={"ID":"3a4dff54-f831-4752-b7b9-67123477ec0e","Type":"ContainerDied","Data":"be81c5109854d13651b22e33d33faca421e7b2e5666125286a1a1c60733609fe"}
Feb 17 19:20:10 crc kubenswrapper[4892]: I0217 19:20:10.986603 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" event={"ID":"3a4dff54-f831-4752-b7b9-67123477ec0e","Type":"ContainerStarted","Data":"37847434dd5ad823e1d7c6b17bd6bd7d6a9e94fc08cbf88acf86910027174249"}
Feb 17 19:20:11 crc kubenswrapper[4892]: I0217 19:20:11.997288 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" event={"ID":"3a4dff54-f831-4752-b7b9-67123477ec0e","Type":"ContainerStarted","Data":"01a562fd2e3c2c4e182fb48388ea03f20288967692f1afabfc6788c50deddc82"}
Feb 17 19:20:11 crc kubenswrapper[4892]: I0217 19:20:11.997546 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:12 crc kubenswrapper[4892]: I0217 19:20:12.026667 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" podStartSLOduration=3.026650936 podStartE2EDuration="3.026650936s" podCreationTimestamp="2026-02-17 19:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:12.017597962 +0000 UTC m=+5783.393001227" watchObservedRunningTime="2026-02-17 19:20:12.026650936 +0000 UTC m=+5783.402054191"
Feb 17 19:20:13 crc kubenswrapper[4892]: I0217 19:20:13.359060 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"
Feb 17 19:20:13 crc kubenswrapper[4892]: E0217 19:20:13.359375 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:20:19 crc kubenswrapper[4892]: I0217 19:20:19.581137 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96"
Feb 17 19:20:19 crc kubenswrapper[4892]: I0217 19:20:19.712765 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6785d9ddb9-ckc89"]
Feb 17 19:20:19 crc kubenswrapper[4892]: I0217 19:20:19.713041 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerName="dnsmasq-dns" containerID="cri-o://21792d4e8dcff1b1ab8500689dffc6a2345228a33400e1ae3d2b3fa593dfcf71" gracePeriod=10
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.084036 4892 generic.go:334] "Generic (PLEG): container finished" podID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerID="21792d4e8dcff1b1ab8500689dffc6a2345228a33400e1ae3d2b3fa593dfcf71" exitCode=0
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.084172 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" event={"ID":"d2641e63-3c52-4760-a46d-b01684e4ebb5","Type":"ContainerDied","Data":"21792d4e8dcff1b1ab8500689dffc6a2345228a33400e1ae3d2b3fa593dfcf71"}
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.248183 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89"
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.421483 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-dns-svc\") pod \"d2641e63-3c52-4760-a46d-b01684e4ebb5\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") "
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.421649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krd5r\" (UniqueName: \"kubernetes.io/projected/d2641e63-3c52-4760-a46d-b01684e4ebb5-kube-api-access-krd5r\") pod \"d2641e63-3c52-4760-a46d-b01684e4ebb5\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") "
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.421715 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-config\") pod \"d2641e63-3c52-4760-a46d-b01684e4ebb5\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") "
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.421775 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-nb\") pod \"d2641e63-3c52-4760-a46d-b01684e4ebb5\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") "
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.421804 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-sb\") pod \"d2641e63-3c52-4760-a46d-b01684e4ebb5\" (UID: \"d2641e63-3c52-4760-a46d-b01684e4ebb5\") "
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.435473 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2641e63-3c52-4760-a46d-b01684e4ebb5-kube-api-access-krd5r" (OuterVolumeSpecName: "kube-api-access-krd5r") pod "d2641e63-3c52-4760-a46d-b01684e4ebb5" (UID: "d2641e63-3c52-4760-a46d-b01684e4ebb5"). InnerVolumeSpecName "kube-api-access-krd5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.520483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2641e63-3c52-4760-a46d-b01684e4ebb5" (UID: "d2641e63-3c52-4760-a46d-b01684e4ebb5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.524176 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.524204 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krd5r\" (UniqueName: \"kubernetes.io/projected/d2641e63-3c52-4760-a46d-b01684e4ebb5-kube-api-access-krd5r\") on node \"crc\" DevicePath \"\""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.536137 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2641e63-3c52-4760-a46d-b01684e4ebb5" (UID: "d2641e63-3c52-4760-a46d-b01684e4ebb5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.540606 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2641e63-3c52-4760-a46d-b01684e4ebb5" (UID: "d2641e63-3c52-4760-a46d-b01684e4ebb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.595285 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-config" (OuterVolumeSpecName: "config") pod "d2641e63-3c52-4760-a46d-b01684e4ebb5" (UID: "d2641e63-3c52-4760-a46d-b01684e4ebb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.627161 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.627201 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-config\") on node \"crc\" DevicePath \"\""
Feb 17 19:20:20 crc kubenswrapper[4892]: I0217 19:20:20.627212 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2641e63-3c52-4760-a46d-b01684e4ebb5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.099659 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89" event={"ID":"d2641e63-3c52-4760-a46d-b01684e4ebb5","Type":"ContainerDied","Data":"dacc19e8c876016cc4bf63fe2b31e8ae4c98d5a944c9f2a94b903a9a80a28fac"}
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.099724 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785d9ddb9-ckc89"
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.099739 4892 scope.go:117] "RemoveContainer" containerID="21792d4e8dcff1b1ab8500689dffc6a2345228a33400e1ae3d2b3fa593dfcf71"
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.131662 4892 scope.go:117] "RemoveContainer" containerID="017d5b444118121233935c69b7c089105c2a96752202a45685379f5c53455d2b"
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.137236 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6785d9ddb9-ckc89"]
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.148461 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6785d9ddb9-ckc89"]
Feb 17 19:20:21 crc kubenswrapper[4892]: I0217 19:20:21.386171 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" path="/var/lib/kubelet/pods/d2641e63-3c52-4760-a46d-b01684e4ebb5/volumes"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.155524 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zn4fv"]
Feb 17 19:20:22 crc kubenswrapper[4892]: E0217 19:20:22.156420 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerName="dnsmasq-dns"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.156440 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerName="dnsmasq-dns"
Feb 17 19:20:22 crc kubenswrapper[4892]: E0217 19:20:22.156503 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerName="init"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.156511 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerName="init"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.156787 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2641e63-3c52-4760-a46d-b01684e4ebb5" containerName="dnsmasq-dns"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.157751 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.165794 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zn4fv"]
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.257191 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5529f735-60b3-453d-8f60-88e712323868-operator-scripts\") pod \"cinder-db-create-zn4fv\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.257612 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9r9w\" (UniqueName: \"kubernetes.io/projected/5529f735-60b3-453d-8f60-88e712323868-kube-api-access-x9r9w\") pod \"cinder-db-create-zn4fv\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.272743 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8127-account-create-update-tkktr"]
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.274235 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.276299 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.284676 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8127-account-create-update-tkktr"]
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.360137 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r9w\" (UniqueName: \"kubernetes.io/projected/5529f735-60b3-453d-8f60-88e712323868-kube-api-access-x9r9w\") pod \"cinder-db-create-zn4fv\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.360260 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l442j\" (UniqueName: \"kubernetes.io/projected/f660f012-1387-4d60-a990-2dbb04f06f42-kube-api-access-l442j\") pod \"cinder-8127-account-create-update-tkktr\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.360329 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5529f735-60b3-453d-8f60-88e712323868-operator-scripts\") pod \"cinder-db-create-zn4fv\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.360357 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f660f012-1387-4d60-a990-2dbb04f06f42-operator-scripts\") pod \"cinder-8127-account-create-update-tkktr\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.361349 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5529f735-60b3-453d-8f60-88e712323868-operator-scripts\") pod \"cinder-db-create-zn4fv\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.380288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9r9w\" (UniqueName: \"kubernetes.io/projected/5529f735-60b3-453d-8f60-88e712323868-kube-api-access-x9r9w\") pod \"cinder-db-create-zn4fv\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.462155 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l442j\" (UniqueName: \"kubernetes.io/projected/f660f012-1387-4d60-a990-2dbb04f06f42-kube-api-access-l442j\") pod \"cinder-8127-account-create-update-tkktr\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.462247 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f660f012-1387-4d60-a990-2dbb04f06f42-operator-scripts\") pod \"cinder-8127-account-create-update-tkktr\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.463001 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f660f012-1387-4d60-a990-2dbb04f06f42-operator-scripts\") pod \"cinder-8127-account-create-update-tkktr\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.478534 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l442j\" (UniqueName: \"kubernetes.io/projected/f660f012-1387-4d60-a990-2dbb04f06f42-kube-api-access-l442j\") pod \"cinder-8127-account-create-update-tkktr\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.519740 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zn4fv"
Feb 17 19:20:22 crc kubenswrapper[4892]: I0217 19:20:22.596304 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8127-account-create-update-tkktr"
Feb 17 19:20:23 crc kubenswrapper[4892]: I0217 19:20:23.039017 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zn4fv"]
Feb 17 19:20:23 crc kubenswrapper[4892]: I0217 19:20:23.126265 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zn4fv" event={"ID":"5529f735-60b3-453d-8f60-88e712323868","Type":"ContainerStarted","Data":"8264b734faf1571eed58b83f23c48160c7f39c3aba98a37565842dd2ce95dcf1"}
Feb 17 19:20:23 crc kubenswrapper[4892]: I0217 19:20:23.131151 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8127-account-create-update-tkktr"]
Feb 17 19:20:23 crc kubenswrapper[4892]: W0217 19:20:23.133119 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf660f012_1387_4d60_a990_2dbb04f06f42.slice/crio-5c52cf775015c5e3a464d9b01f8323d5bf7305a2f590a1b3e11f71f421445b0f WatchSource:0}: Error finding container 5c52cf775015c5e3a464d9b01f8323d5bf7305a2f590a1b3e11f71f421445b0f: Status 404 returned error can't find the container with id 5c52cf775015c5e3a464d9b01f8323d5bf7305a2f590a1b3e11f71f421445b0f
Feb 17 19:20:24 crc kubenswrapper[4892]: I0217 19:20:24.144117 4892 generic.go:334] "Generic (PLEG): container finished" podID="f660f012-1387-4d60-a990-2dbb04f06f42" containerID="26e250c5841f2649f9ae1d65c7a8e9dc959a847b50ea4bc78cc68af760a591f0" exitCode=0
Feb 17 19:20:24 crc kubenswrapper[4892]: I0217 19:20:24.144900 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8127-account-create-update-tkktr" event={"ID":"f660f012-1387-4d60-a990-2dbb04f06f42","Type":"ContainerDied","Data":"26e250c5841f2649f9ae1d65c7a8e9dc959a847b50ea4bc78cc68af760a591f0"}
Feb 17 19:20:24 crc kubenswrapper[4892]: I0217 19:20:24.144970 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8127-account-create-update-tkktr" event={"ID":"f660f012-1387-4d60-a990-2dbb04f06f42","Type":"ContainerStarted","Data":"5c52cf775015c5e3a464d9b01f8323d5bf7305a2f590a1b3e11f71f421445b0f"}
Feb 17 19:20:24 crc kubenswrapper[4892]: I0217 19:20:24.150725 4892 generic.go:334] "Generic (PLEG): container finished" podID="5529f735-60b3-453d-8f60-88e712323868" containerID="097e4bcc8898757b053b51217b7726af600c68ea15dbff18282d663e27363bf9" exitCode=0
Feb 17 19:20:24 crc kubenswrapper[4892]: I0217 19:20:24.150810 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zn4fv" event={"ID":"5529f735-60b3-453d-8f60-88e712323868","Type":"ContainerDied","Data":"097e4bcc8898757b053b51217b7726af600c68ea15dbff18282d663e27363bf9"}
Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.360031 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2"
Feb 17 19:20:25 crc kubenswrapper[4892]: E0217 19:20:25.360632 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.741278 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8127-account-create-update-tkktr" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.748268 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zn4fv" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.825336 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l442j\" (UniqueName: \"kubernetes.io/projected/f660f012-1387-4d60-a990-2dbb04f06f42-kube-api-access-l442j\") pod \"f660f012-1387-4d60-a990-2dbb04f06f42\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.825428 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9r9w\" (UniqueName: \"kubernetes.io/projected/5529f735-60b3-453d-8f60-88e712323868-kube-api-access-x9r9w\") pod \"5529f735-60b3-453d-8f60-88e712323868\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.825483 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f660f012-1387-4d60-a990-2dbb04f06f42-operator-scripts\") pod \"f660f012-1387-4d60-a990-2dbb04f06f42\" (UID: \"f660f012-1387-4d60-a990-2dbb04f06f42\") " Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.825544 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5529f735-60b3-453d-8f60-88e712323868-operator-scripts\") pod 
\"5529f735-60b3-453d-8f60-88e712323868\" (UID: \"5529f735-60b3-453d-8f60-88e712323868\") " Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.826850 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f660f012-1387-4d60-a990-2dbb04f06f42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f660f012-1387-4d60-a990-2dbb04f06f42" (UID: "f660f012-1387-4d60-a990-2dbb04f06f42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.827141 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5529f735-60b3-453d-8f60-88e712323868-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5529f735-60b3-453d-8f60-88e712323868" (UID: "5529f735-60b3-453d-8f60-88e712323868"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.833218 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5529f735-60b3-453d-8f60-88e712323868-kube-api-access-x9r9w" (OuterVolumeSpecName: "kube-api-access-x9r9w") pod "5529f735-60b3-453d-8f60-88e712323868" (UID: "5529f735-60b3-453d-8f60-88e712323868"). InnerVolumeSpecName "kube-api-access-x9r9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.834729 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f660f012-1387-4d60-a990-2dbb04f06f42-kube-api-access-l442j" (OuterVolumeSpecName: "kube-api-access-l442j") pod "f660f012-1387-4d60-a990-2dbb04f06f42" (UID: "f660f012-1387-4d60-a990-2dbb04f06f42"). InnerVolumeSpecName "kube-api-access-l442j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.927595 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l442j\" (UniqueName: \"kubernetes.io/projected/f660f012-1387-4d60-a990-2dbb04f06f42-kube-api-access-l442j\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.927643 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9r9w\" (UniqueName: \"kubernetes.io/projected/5529f735-60b3-453d-8f60-88e712323868-kube-api-access-x9r9w\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.927659 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f660f012-1387-4d60-a990-2dbb04f06f42-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:25 crc kubenswrapper[4892]: I0217 19:20:25.927670 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5529f735-60b3-453d-8f60-88e712323868-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:26 crc kubenswrapper[4892]: I0217 19:20:26.182321 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zn4fv" event={"ID":"5529f735-60b3-453d-8f60-88e712323868","Type":"ContainerDied","Data":"8264b734faf1571eed58b83f23c48160c7f39c3aba98a37565842dd2ce95dcf1"} Feb 17 19:20:26 crc kubenswrapper[4892]: I0217 19:20:26.182599 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8264b734faf1571eed58b83f23c48160c7f39c3aba98a37565842dd2ce95dcf1" Feb 17 19:20:26 crc kubenswrapper[4892]: I0217 19:20:26.182404 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zn4fv" Feb 17 19:20:26 crc kubenswrapper[4892]: I0217 19:20:26.185416 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8127-account-create-update-tkktr" event={"ID":"f660f012-1387-4d60-a990-2dbb04f06f42","Type":"ContainerDied","Data":"5c52cf775015c5e3a464d9b01f8323d5bf7305a2f590a1b3e11f71f421445b0f"} Feb 17 19:20:26 crc kubenswrapper[4892]: I0217 19:20:26.185450 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c52cf775015c5e3a464d9b01f8323d5bf7305a2f590a1b3e11f71f421445b0f" Feb 17 19:20:26 crc kubenswrapper[4892]: I0217 19:20:26.185503 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8127-account-create-update-tkktr" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.611671 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-hqdbg"] Feb 17 19:20:27 crc kubenswrapper[4892]: E0217 19:20:27.612689 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f660f012-1387-4d60-a990-2dbb04f06f42" containerName="mariadb-account-create-update" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.612727 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f660f012-1387-4d60-a990-2dbb04f06f42" containerName="mariadb-account-create-update" Feb 17 19:20:27 crc kubenswrapper[4892]: E0217 19:20:27.612772 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5529f735-60b3-453d-8f60-88e712323868" containerName="mariadb-database-create" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.612789 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5529f735-60b3-453d-8f60-88e712323868" containerName="mariadb-database-create" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.613365 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5529f735-60b3-453d-8f60-88e712323868" 
containerName="mariadb-database-create" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.613400 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f660f012-1387-4d60-a990-2dbb04f06f42" containerName="mariadb-account-create-update" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.615023 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.622232 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.622810 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.622980 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jzvx2" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.630505 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hqdbg"] Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.767264 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-scripts\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.767366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-combined-ca-bundle\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.767486 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-config-data\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.767520 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-db-sync-config-data\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.767549 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1436563-c0ac-4fff-9f7d-84e644d9061a-etc-machine-id\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.767581 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9sgm\" (UniqueName: \"kubernetes.io/projected/b1436563-c0ac-4fff-9f7d-84e644d9061a-kube-api-access-d9sgm\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.869612 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-scripts\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.869683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-combined-ca-bundle\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.869788 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-config-data\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.869830 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-db-sync-config-data\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.869854 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1436563-c0ac-4fff-9f7d-84e644d9061a-etc-machine-id\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.869874 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9sgm\" (UniqueName: \"kubernetes.io/projected/b1436563-c0ac-4fff-9f7d-84e644d9061a-kube-api-access-d9sgm\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.870498 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1436563-c0ac-4fff-9f7d-84e644d9061a-etc-machine-id\") pod 
\"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.874548 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-config-data\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.888277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-db-sync-config-data\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.888542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-combined-ca-bundle\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.890728 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-scripts\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.891147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9sgm\" (UniqueName: \"kubernetes.io/projected/b1436563-c0ac-4fff-9f7d-84e644d9061a-kube-api-access-d9sgm\") pod \"cinder-db-sync-hqdbg\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:27 crc kubenswrapper[4892]: I0217 19:20:27.933658 
4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:28 crc kubenswrapper[4892]: I0217 19:20:28.458565 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hqdbg"] Feb 17 19:20:29 crc kubenswrapper[4892]: I0217 19:20:29.217057 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqdbg" event={"ID":"b1436563-c0ac-4fff-9f7d-84e644d9061a","Type":"ContainerStarted","Data":"f486a1aef66e93de89256eb5a1cf20ac8317ef5671afda6e70d99c02d18a046c"} Feb 17 19:20:29 crc kubenswrapper[4892]: I0217 19:20:29.217419 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqdbg" event={"ID":"b1436563-c0ac-4fff-9f7d-84e644d9061a","Type":"ContainerStarted","Data":"177d2e99aa2a77fd2e91858a9f85a226ef0c3d8e775decb652306f322f3c5988"} Feb 17 19:20:29 crc kubenswrapper[4892]: I0217 19:20:29.242297 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-hqdbg" podStartSLOduration=2.242276605 podStartE2EDuration="2.242276605s" podCreationTimestamp="2026-02-17 19:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:29.235981195 +0000 UTC m=+5800.611384480" watchObservedRunningTime="2026-02-17 19:20:29.242276605 +0000 UTC m=+5800.617679870" Feb 17 19:20:32 crc kubenswrapper[4892]: I0217 19:20:32.337163 4892 generic.go:334] "Generic (PLEG): container finished" podID="b1436563-c0ac-4fff-9f7d-84e644d9061a" containerID="f486a1aef66e93de89256eb5a1cf20ac8317ef5671afda6e70d99c02d18a046c" exitCode=0 Feb 17 19:20:32 crc kubenswrapper[4892]: I0217 19:20:32.337653 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqdbg" 
event={"ID":"b1436563-c0ac-4fff-9f7d-84e644d9061a","Type":"ContainerDied","Data":"f486a1aef66e93de89256eb5a1cf20ac8317ef5671afda6e70d99c02d18a046c"} Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.737104 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.811589 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-config-data\") pod \"b1436563-c0ac-4fff-9f7d-84e644d9061a\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.811655 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-db-sync-config-data\") pod \"b1436563-c0ac-4fff-9f7d-84e644d9061a\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.811725 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-combined-ca-bundle\") pod \"b1436563-c0ac-4fff-9f7d-84e644d9061a\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.811870 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-scripts\") pod \"b1436563-c0ac-4fff-9f7d-84e644d9061a\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.811930 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9sgm\" (UniqueName: 
\"kubernetes.io/projected/b1436563-c0ac-4fff-9f7d-84e644d9061a-kube-api-access-d9sgm\") pod \"b1436563-c0ac-4fff-9f7d-84e644d9061a\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.811968 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1436563-c0ac-4fff-9f7d-84e644d9061a-etc-machine-id\") pod \"b1436563-c0ac-4fff-9f7d-84e644d9061a\" (UID: \"b1436563-c0ac-4fff-9f7d-84e644d9061a\") " Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.812575 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1436563-c0ac-4fff-9f7d-84e644d9061a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1436563-c0ac-4fff-9f7d-84e644d9061a" (UID: "b1436563-c0ac-4fff-9f7d-84e644d9061a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.816958 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b1436563-c0ac-4fff-9f7d-84e644d9061a" (UID: "b1436563-c0ac-4fff-9f7d-84e644d9061a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.817243 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-scripts" (OuterVolumeSpecName: "scripts") pod "b1436563-c0ac-4fff-9f7d-84e644d9061a" (UID: "b1436563-c0ac-4fff-9f7d-84e644d9061a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.817927 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1436563-c0ac-4fff-9f7d-84e644d9061a-kube-api-access-d9sgm" (OuterVolumeSpecName: "kube-api-access-d9sgm") pod "b1436563-c0ac-4fff-9f7d-84e644d9061a" (UID: "b1436563-c0ac-4fff-9f7d-84e644d9061a"). InnerVolumeSpecName "kube-api-access-d9sgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.843579 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1436563-c0ac-4fff-9f7d-84e644d9061a" (UID: "b1436563-c0ac-4fff-9f7d-84e644d9061a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.861994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-config-data" (OuterVolumeSpecName: "config-data") pod "b1436563-c0ac-4fff-9f7d-84e644d9061a" (UID: "b1436563-c0ac-4fff-9f7d-84e644d9061a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.914075 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.914103 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.914112 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9sgm\" (UniqueName: \"kubernetes.io/projected/b1436563-c0ac-4fff-9f7d-84e644d9061a-kube-api-access-d9sgm\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.914123 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1436563-c0ac-4fff-9f7d-84e644d9061a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.914131 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:33 crc kubenswrapper[4892]: I0217 19:20:33.914140 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1436563-c0ac-4fff-9f7d-84e644d9061a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.362058 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqdbg" event={"ID":"b1436563-c0ac-4fff-9f7d-84e644d9061a","Type":"ContainerDied","Data":"177d2e99aa2a77fd2e91858a9f85a226ef0c3d8e775decb652306f322f3c5988"} Feb 17 19:20:34 crc 
kubenswrapper[4892]: I0217 19:20:34.362115 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hqdbg" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.362132 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="177d2e99aa2a77fd2e91858a9f85a226ef0c3d8e775decb652306f322f3c5988" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.790479 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-747474b8df-c8tcp"] Feb 17 19:20:34 crc kubenswrapper[4892]: E0217 19:20:34.792090 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1436563-c0ac-4fff-9f7d-84e644d9061a" containerName="cinder-db-sync" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.792182 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1436563-c0ac-4fff-9f7d-84e644d9061a" containerName="cinder-db-sync" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.792703 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1436563-c0ac-4fff-9f7d-84e644d9061a" containerName="cinder-db-sync" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.797411 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.816543 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-747474b8df-c8tcp"] Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.865086 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.866739 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.869471 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jzvx2" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.875386 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.875600 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.875622 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.888466 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.938744 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5pr\" (UniqueName: \"kubernetes.io/projected/db264755-2d6e-4d13-ab1f-1a4b3712b242-kube-api-access-sl5pr\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.938966 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data-custom\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939083 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db264755-2d6e-4d13-ab1f-1a4b3712b242-logs\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 
19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-dns-svc\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939235 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939326 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939404 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db264755-2d6e-4d13-ab1f-1a4b3712b242-etc-machine-id\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939488 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-scripts\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939576 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-sb\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-nb\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939826 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl8p7\" (UniqueName: \"kubernetes.io/projected/bb938d82-faba-45d3-8829-2aeb76c0e18c-kube-api-access-rl8p7\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:34 crc kubenswrapper[4892]: I0217 19:20:34.939905 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-config\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.041797 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db264755-2d6e-4d13-ab1f-1a4b3712b242-logs\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.042646 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-dns-svc\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.042769 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.042214 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db264755-2d6e-4d13-ab1f-1a4b3712b242-logs\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.042958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043114 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db264755-2d6e-4d13-ab1f-1a4b3712b242-etc-machine-id\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-scripts\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 
19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043296 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-nb\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043300 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db264755-2d6e-4d13-ab1f-1a4b3712b242-etc-machine-id\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043328 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-sb\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043830 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-dns-svc\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.043844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl8p7\" (UniqueName: \"kubernetes.io/projected/bb938d82-faba-45d3-8829-2aeb76c0e18c-kube-api-access-rl8p7\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.044032 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-config\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.044106 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-sb\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.044131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-nb\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.044281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5pr\" (UniqueName: \"kubernetes.io/projected/db264755-2d6e-4d13-ab1f-1a4b3712b242-kube-api-access-sl5pr\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.044320 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data-custom\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.045059 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-config\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.047909 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-scripts\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.051161 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data-custom\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.051766 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.060380 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.063635 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5pr\" (UniqueName: \"kubernetes.io/projected/db264755-2d6e-4d13-ab1f-1a4b3712b242-kube-api-access-sl5pr\") pod \"cinder-api-0\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 
19:20:35.075840 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl8p7\" (UniqueName: \"kubernetes.io/projected/bb938d82-faba-45d3-8829-2aeb76c0e18c-kube-api-access-rl8p7\") pod \"dnsmasq-dns-747474b8df-c8tcp\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") " pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.117524 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.202399 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.585221 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-747474b8df-c8tcp"] Feb 17 19:20:35 crc kubenswrapper[4892]: W0217 19:20:35.819054 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb264755_2d6e_4d13_ab1f_1a4b3712b242.slice/crio-ff8ce85defc5cf70b20da8db703e51504947e47aa18cf1f69a817d9e73a02c4c WatchSource:0}: Error finding container ff8ce85defc5cf70b20da8db703e51504947e47aa18cf1f69a817d9e73a02c4c: Status 404 returned error can't find the container with id ff8ce85defc5cf70b20da8db703e51504947e47aa18cf1f69a817d9e73a02c4c Feb 17 19:20:35 crc kubenswrapper[4892]: I0217 19:20:35.825772 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:20:36 crc kubenswrapper[4892]: I0217 19:20:36.384889 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db264755-2d6e-4d13-ab1f-1a4b3712b242","Type":"ContainerStarted","Data":"ff8ce85defc5cf70b20da8db703e51504947e47aa18cf1f69a817d9e73a02c4c"} Feb 17 19:20:36 crc kubenswrapper[4892]: I0217 19:20:36.387168 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerID="81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05" exitCode=0 Feb 17 19:20:36 crc kubenswrapper[4892]: I0217 19:20:36.387194 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" event={"ID":"bb938d82-faba-45d3-8829-2aeb76c0e18c","Type":"ContainerDied","Data":"81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05"} Feb 17 19:20:36 crc kubenswrapper[4892]: I0217 19:20:36.387210 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" event={"ID":"bb938d82-faba-45d3-8829-2aeb76c0e18c","Type":"ContainerStarted","Data":"56d64137ca22148bf7b17ecbf0887980901fd02d7117d2310abfabe69f91a72a"} Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.408245 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db264755-2d6e-4d13-ab1f-1a4b3712b242","Type":"ContainerStarted","Data":"7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5"} Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.408767 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.408783 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db264755-2d6e-4d13-ab1f-1a4b3712b242","Type":"ContainerStarted","Data":"1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b"} Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.414020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" event={"ID":"bb938d82-faba-45d3-8829-2aeb76c0e18c","Type":"ContainerStarted","Data":"8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9"} Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.414337 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.437552 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.437532472 podStartE2EDuration="3.437532472s" podCreationTimestamp="2026-02-17 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:37.43302963 +0000 UTC m=+5808.808432925" watchObservedRunningTime="2026-02-17 19:20:37.437532472 +0000 UTC m=+5808.812935737" Feb 17 19:20:37 crc kubenswrapper[4892]: I0217 19:20:37.461801 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" podStartSLOduration=3.461782696 podStartE2EDuration="3.461782696s" podCreationTimestamp="2026-02-17 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:37.453007889 +0000 UTC m=+5808.828411154" watchObservedRunningTime="2026-02-17 19:20:37.461782696 +0000 UTC m=+5808.837185961" Feb 17 19:20:39 crc kubenswrapper[4892]: I0217 19:20:39.378156 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:20:39 crc kubenswrapper[4892]: E0217 19:20:39.379008 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.120075 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-747474b8df-c8tcp" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.278811 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67fc5945-dlf96"] Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.279429 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerName="dnsmasq-dns" containerID="cri-o://01a562fd2e3c2c4e182fb48388ea03f20288967692f1afabfc6788c50deddc82" gracePeriod=10 Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.514664 4892 generic.go:334] "Generic (PLEG): container finished" podID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerID="01a562fd2e3c2c4e182fb48388ea03f20288967692f1afabfc6788c50deddc82" exitCode=0 Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.514706 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" event={"ID":"3a4dff54-f831-4752-b7b9-67123477ec0e","Type":"ContainerDied","Data":"01a562fd2e3c2c4e182fb48388ea03f20288967692f1afabfc6788c50deddc82"} Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.790938 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.897949 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmzgq\" (UniqueName: \"kubernetes.io/projected/3a4dff54-f831-4752-b7b9-67123477ec0e-kube-api-access-pmzgq\") pod \"3a4dff54-f831-4752-b7b9-67123477ec0e\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.898198 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-dns-svc\") pod \"3a4dff54-f831-4752-b7b9-67123477ec0e\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.898277 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-nb\") pod \"3a4dff54-f831-4752-b7b9-67123477ec0e\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.898328 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-sb\") pod \"3a4dff54-f831-4752-b7b9-67123477ec0e\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.898449 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-config\") pod \"3a4dff54-f831-4752-b7b9-67123477ec0e\" (UID: \"3a4dff54-f831-4752-b7b9-67123477ec0e\") " Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.911202 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3a4dff54-f831-4752-b7b9-67123477ec0e-kube-api-access-pmzgq" (OuterVolumeSpecName: "kube-api-access-pmzgq") pod "3a4dff54-f831-4752-b7b9-67123477ec0e" (UID: "3a4dff54-f831-4752-b7b9-67123477ec0e"). InnerVolumeSpecName "kube-api-access-pmzgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.959112 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a4dff54-f831-4752-b7b9-67123477ec0e" (UID: "3a4dff54-f831-4752-b7b9-67123477ec0e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.965857 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-config" (OuterVolumeSpecName: "config") pod "3a4dff54-f831-4752-b7b9-67123477ec0e" (UID: "3a4dff54-f831-4752-b7b9-67123477ec0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.972275 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a4dff54-f831-4752-b7b9-67123477ec0e" (UID: "3a4dff54-f831-4752-b7b9-67123477ec0e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:20:45 crc kubenswrapper[4892]: I0217 19:20:45.974768 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a4dff54-f831-4752-b7b9-67123477ec0e" (UID: "3a4dff54-f831-4752-b7b9-67123477ec0e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.001055 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmzgq\" (UniqueName: \"kubernetes.io/projected/3a4dff54-f831-4752-b7b9-67123477ec0e-kube-api-access-pmzgq\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.001095 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.001108 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.001121 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.001132 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4dff54-f831-4752-b7b9-67123477ec0e-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.527191 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" event={"ID":"3a4dff54-f831-4752-b7b9-67123477ec0e","Type":"ContainerDied","Data":"37847434dd5ad823e1d7c6b17bd6bd7d6a9e94fc08cbf88acf86910027174249"} Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.527238 4892 scope.go:117] "RemoveContainer" containerID="01a562fd2e3c2c4e182fb48388ea03f20288967692f1afabfc6788c50deddc82" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.527364 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c67fc5945-dlf96" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.583030 4892 scope.go:117] "RemoveContainer" containerID="be81c5109854d13651b22e33d33faca421e7b2e5666125286a1a1c60733609fe" Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.585444 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67fc5945-dlf96"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.641811 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c67fc5945-dlf96"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.658876 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.659086 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3d7ba1b53e5252313403655198580bfb151aeca1caa5cc295583ec81123f5097" gracePeriod=30 Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.686443 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.686676 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bc7212aa-fb51-4121-a36c-99a201ee026d" containerName="nova-scheduler-scheduler" containerID="cri-o://c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515" gracePeriod=30 Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.694092 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.694235 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" 
containerName="nova-metadata-log" containerID="cri-o://c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75" gracePeriod=30 Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.694445 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-metadata" containerID="cri-o://f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2" gracePeriod=30 Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.713079 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.713505 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f9243c8d-1b08-4ff6-aca9-be5aab1a4437" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8f577a819e93460b86b0612097544a6ee471507bd452e9a5a9ad36b7fd3da05a" gracePeriod=30 Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.724466 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.724788 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-api" containerID="cri-o://a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490" gracePeriod=30 Feb 17 19:20:46 crc kubenswrapper[4892]: I0217 19:20:46.724982 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-log" containerID="cri-o://cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545" gracePeriod=30 Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.373849 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" path="/var/lib/kubelet/pods/3a4dff54-f831-4752-b7b9-67123477ec0e/volumes" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.542825 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9243c8d-1b08-4ff6-aca9-be5aab1a4437" containerID="8f577a819e93460b86b0612097544a6ee471507bd452e9a5a9ad36b7fd3da05a" exitCode=0 Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.542873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9243c8d-1b08-4ff6-aca9-be5aab1a4437","Type":"ContainerDied","Data":"8f577a819e93460b86b0612097544a6ee471507bd452e9a5a9ad36b7fd3da05a"} Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.543121 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9243c8d-1b08-4ff6-aca9-be5aab1a4437","Type":"ContainerDied","Data":"97c038e15897a4dcbdfb0f52b0be760dc44f6e83e87a3ce2f27ada6073485d85"} Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.543138 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c038e15897a4dcbdfb0f52b0be760dc44f6e83e87a3ce2f27ada6073485d85" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.545392 4892 generic.go:334] "Generic (PLEG): container finished" podID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerID="c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75" exitCode=143 Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.545476 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61db98bb-314d-4dd3-9a11-88cac622dea5","Type":"ContainerDied","Data":"c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75"} Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.545654 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.547309 4892 generic.go:334] "Generic (PLEG): container finished" podID="88699460-d2ea-4d92-acf7-150aff42bf48" containerID="cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545" exitCode=143 Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.547346 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88699460-d2ea-4d92-acf7-150aff42bf48","Type":"ContainerDied","Data":"cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545"} Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.598231 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.632529 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-combined-ca-bundle\") pod \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.632740 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-config-data\") pod \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.632829 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9htgx\" (UniqueName: \"kubernetes.io/projected/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-kube-api-access-9htgx\") pod \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\" (UID: \"f9243c8d-1b08-4ff6-aca9-be5aab1a4437\") " Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.644727 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-kube-api-access-9htgx" (OuterVolumeSpecName: "kube-api-access-9htgx") pod "f9243c8d-1b08-4ff6-aca9-be5aab1a4437" (UID: "f9243c8d-1b08-4ff6-aca9-be5aab1a4437"). InnerVolumeSpecName "kube-api-access-9htgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.661291 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9243c8d-1b08-4ff6-aca9-be5aab1a4437" (UID: "f9243c8d-1b08-4ff6-aca9-be5aab1a4437"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.698278 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-config-data" (OuterVolumeSpecName: "config-data") pod "f9243c8d-1b08-4ff6-aca9-be5aab1a4437" (UID: "f9243c8d-1b08-4ff6-aca9-be5aab1a4437"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.739075 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9htgx\" (UniqueName: \"kubernetes.io/projected/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-kube-api-access-9htgx\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.739114 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:47 crc kubenswrapper[4892]: I0217 19:20:47.739125 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9243c8d-1b08-4ff6-aca9-be5aab1a4437-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.026696 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.147702 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-config-data\") pod \"bc7212aa-fb51-4121-a36c-99a201ee026d\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.147762 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqdnr\" (UniqueName: \"kubernetes.io/projected/bc7212aa-fb51-4121-a36c-99a201ee026d-kube-api-access-xqdnr\") pod \"bc7212aa-fb51-4121-a36c-99a201ee026d\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.147787 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-combined-ca-bundle\") pod \"bc7212aa-fb51-4121-a36c-99a201ee026d\" (UID: \"bc7212aa-fb51-4121-a36c-99a201ee026d\") " Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.151497 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7212aa-fb51-4121-a36c-99a201ee026d-kube-api-access-xqdnr" (OuterVolumeSpecName: "kube-api-access-xqdnr") pod "bc7212aa-fb51-4121-a36c-99a201ee026d" (UID: "bc7212aa-fb51-4121-a36c-99a201ee026d"). InnerVolumeSpecName "kube-api-access-xqdnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.183427 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-config-data" (OuterVolumeSpecName: "config-data") pod "bc7212aa-fb51-4121-a36c-99a201ee026d" (UID: "bc7212aa-fb51-4121-a36c-99a201ee026d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.193161 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc7212aa-fb51-4121-a36c-99a201ee026d" (UID: "bc7212aa-fb51-4121-a36c-99a201ee026d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.250229 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.250265 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqdnr\" (UniqueName: \"kubernetes.io/projected/bc7212aa-fb51-4121-a36c-99a201ee026d-kube-api-access-xqdnr\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.250275 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7212aa-fb51-4121-a36c-99a201ee026d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.561582 4892 generic.go:334] "Generic (PLEG): container finished" podID="bc7212aa-fb51-4121-a36c-99a201ee026d" containerID="c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515" exitCode=0 Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.561676 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.562831 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc7212aa-fb51-4121-a36c-99a201ee026d","Type":"ContainerDied","Data":"c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515"} Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.562882 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc7212aa-fb51-4121-a36c-99a201ee026d","Type":"ContainerDied","Data":"c1be2e80e5582baf9eb07295a1187c9de3095c4f521e2e406e4dd10e930aed32"} Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.562902 4892 scope.go:117] "RemoveContainer" containerID="c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.564148 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.635510 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.653686 4892 scope.go:117] "RemoveContainer" containerID="c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515" Feb 17 19:20:48 crc kubenswrapper[4892]: E0217 19:20:48.654161 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515\": container with ID starting with c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515 not found: ID does not exist" containerID="c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.654207 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515"} err="failed to get container status \"c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515\": rpc error: code = NotFound desc = could not find container \"c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515\": container with ID starting with c475e8b05c7af30a23bd45d3f75832fa2ffcb0f0025d4eb7e5a3728c26973515 not found: ID does not exist" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.665172 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.683239 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.694139 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.705561 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: E0217 19:20:48.706116 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerName="dnsmasq-dns" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706134 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerName="dnsmasq-dns" Feb 17 19:20:48 crc kubenswrapper[4892]: E0217 19:20:48.706146 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerName="init" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706153 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerName="init" Feb 17 19:20:48 crc kubenswrapper[4892]: E0217 19:20:48.706175 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9243c8d-1b08-4ff6-aca9-be5aab1a4437" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706181 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9243c8d-1b08-4ff6-aca9-be5aab1a4437" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 19:20:48 crc kubenswrapper[4892]: E0217 19:20:48.706197 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7212aa-fb51-4121-a36c-99a201ee026d" containerName="nova-scheduler-scheduler" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706202 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7212aa-fb51-4121-a36c-99a201ee026d" containerName="nova-scheduler-scheduler" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706409 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7212aa-fb51-4121-a36c-99a201ee026d" containerName="nova-scheduler-scheduler" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706436 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4dff54-f831-4752-b7b9-67123477ec0e" containerName="dnsmasq-dns" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.706453 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9243c8d-1b08-4ff6-aca9-be5aab1a4437" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.707166 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.709107 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.717456 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.730332 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.732200 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.734480 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.745275 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.768740 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2c6c93-db35-404a-a613-34cb4b2de98b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.768846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f2c6c93-db35-404a-a613-34cb4b2de98b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.768990 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-z56kp\" (UniqueName: \"kubernetes.io/projected/8f2c6c93-db35-404a-a613-34cb4b2de98b-kube-api-access-z56kp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.870981 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z56kp\" (UniqueName: \"kubernetes.io/projected/8f2c6c93-db35-404a-a613-34cb4b2de98b-kube-api-access-z56kp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.871060 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.871108 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2c6c93-db35-404a-a613-34cb4b2de98b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.871128 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-config-data\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.871170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8f2c6c93-db35-404a-a613-34cb4b2de98b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.871214 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8xc7\" (UniqueName: \"kubernetes.io/projected/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-kube-api-access-q8xc7\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.876161 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2c6c93-db35-404a-a613-34cb4b2de98b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.887534 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f2c6c93-db35-404a-a613-34cb4b2de98b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.888783 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z56kp\" (UniqueName: \"kubernetes.io/projected/8f2c6c93-db35-404a-a613-34cb4b2de98b-kube-api-access-z56kp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f2c6c93-db35-404a-a613-34cb4b2de98b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.972390 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-config-data\") pod \"nova-scheduler-0\" (UID: 
\"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.973565 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xc7\" (UniqueName: \"kubernetes.io/projected/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-kube-api-access-q8xc7\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.974283 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.976244 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-config-data\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.988490 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:48 crc kubenswrapper[4892]: I0217 19:20:48.990837 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8xc7\" (UniqueName: \"kubernetes.io/projected/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-kube-api-access-q8xc7\") pod \"nova-scheduler-0\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " pod="openstack/nova-scheduler-0" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.033340 4892 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.053561 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.372150 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7212aa-fb51-4121-a36c-99a201ee026d" path="/var/lib/kubelet/pods/bc7212aa-fb51-4121-a36c-99a201ee026d/volumes" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.373128 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9243c8d-1b08-4ff6-aca9-be5aab1a4437" path="/var/lib/kubelet/pods/f9243c8d-1b08-4ff6-aca9-be5aab1a4437/volumes" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.553431 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 19:20:49 crc kubenswrapper[4892]: W0217 19:20:49.555884 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2e0bc87_ae8f_4e54_9301_03b2ae1c1d17.slice/crio-1029c03b3848928a3c761fe4bba5961501ebe32ad22655b1c3a5a353a9ddfc00 WatchSource:0}: Error finding container 1029c03b3848928a3c761fe4bba5961501ebe32ad22655b1c3a5a353a9ddfc00: Status 404 returned error can't find the container with id 1029c03b3848928a3c761fe4bba5961501ebe32ad22655b1c3a5a353a9ddfc00 Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.566024 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.577295 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17","Type":"ContainerStarted","Data":"1029c03b3848928a3c761fe4bba5961501ebe32ad22655b1c3a5a353a9ddfc00"} Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.581633 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f2c6c93-db35-404a-a613-34cb4b2de98b","Type":"ContainerStarted","Data":"fc038d47280c0c4ec6c52f3e4e4d4b7e2af02c3dc6eea368b2b328aa221eeb86"} Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.836856 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": read tcp 10.217.0.2:49690->10.217.1.101:8775: read: connection reset by peer" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.837501 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": read tcp 10.217.0.2:49674->10.217.1.101:8775: read: connection reset by peer" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.863594 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.102:8774/\": read tcp 10.217.0.2:55806->10.217.1.102:8774: read: connection reset by peer" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.864405 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.102:8774/\": read tcp 10.217.0.2:55808->10.217.1.102:8774: read: connection reset by peer" Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.936554 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:20:49 crc kubenswrapper[4892]: I0217 19:20:49.936828 4892 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-cell1-conductor-0" podUID="dba96c41-f19e-47ca-9cd0-1cf12d32d448" containerName="nova-cell1-conductor-conductor" containerID="cri-o://71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381" gracePeriod=30 Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.313169 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.404341 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.408603 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-combined-ca-bundle\") pod \"61db98bb-314d-4dd3-9a11-88cac622dea5\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.408724 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db98bb-314d-4dd3-9a11-88cac622dea5-logs\") pod \"61db98bb-314d-4dd3-9a11-88cac622dea5\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.408751 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-config-data\") pod \"61db98bb-314d-4dd3-9a11-88cac622dea5\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.409003 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb9c4\" (UniqueName: \"kubernetes.io/projected/61db98bb-314d-4dd3-9a11-88cac622dea5-kube-api-access-tb9c4\") pod \"61db98bb-314d-4dd3-9a11-88cac622dea5\" (UID: \"61db98bb-314d-4dd3-9a11-88cac622dea5\") " 
Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.410150 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61db98bb-314d-4dd3-9a11-88cac622dea5-logs" (OuterVolumeSpecName: "logs") pod "61db98bb-314d-4dd3-9a11-88cac622dea5" (UID: "61db98bb-314d-4dd3-9a11-88cac622dea5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.416827 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61db98bb-314d-4dd3-9a11-88cac622dea5-kube-api-access-tb9c4" (OuterVolumeSpecName: "kube-api-access-tb9c4") pod "61db98bb-314d-4dd3-9a11-88cac622dea5" (UID: "61db98bb-314d-4dd3-9a11-88cac622dea5"). InnerVolumeSpecName "kube-api-access-tb9c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.421864 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db98bb-314d-4dd3-9a11-88cac622dea5-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.421910 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb9c4\" (UniqueName: \"kubernetes.io/projected/61db98bb-314d-4dd3-9a11-88cac622dea5-kube-api-access-tb9c4\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.485155 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-config-data" (OuterVolumeSpecName: "config-data") pod "61db98bb-314d-4dd3-9a11-88cac622dea5" (UID: "61db98bb-314d-4dd3-9a11-88cac622dea5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.490338 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61db98bb-314d-4dd3-9a11-88cac622dea5" (UID: "61db98bb-314d-4dd3-9a11-88cac622dea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.522945 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-combined-ca-bundle\") pod \"88699460-d2ea-4d92-acf7-150aff42bf48\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.523193 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pltl4\" (UniqueName: \"kubernetes.io/projected/88699460-d2ea-4d92-acf7-150aff42bf48-kube-api-access-pltl4\") pod \"88699460-d2ea-4d92-acf7-150aff42bf48\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.523286 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-config-data\") pod \"88699460-d2ea-4d92-acf7-150aff42bf48\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.523312 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88699460-d2ea-4d92-acf7-150aff42bf48-logs\") pod \"88699460-d2ea-4d92-acf7-150aff42bf48\" (UID: \"88699460-d2ea-4d92-acf7-150aff42bf48\") " Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.523748 4892 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.523764 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db98bb-314d-4dd3-9a11-88cac622dea5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.524077 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88699460-d2ea-4d92-acf7-150aff42bf48-logs" (OuterVolumeSpecName: "logs") pod "88699460-d2ea-4d92-acf7-150aff42bf48" (UID: "88699460-d2ea-4d92-acf7-150aff42bf48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.528970 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88699460-d2ea-4d92-acf7-150aff42bf48-kube-api-access-pltl4" (OuterVolumeSpecName: "kube-api-access-pltl4") pod "88699460-d2ea-4d92-acf7-150aff42bf48" (UID: "88699460-d2ea-4d92-acf7-150aff42bf48"). InnerVolumeSpecName "kube-api-access-pltl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.548078 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-config-data" (OuterVolumeSpecName: "config-data") pod "88699460-d2ea-4d92-acf7-150aff42bf48" (UID: "88699460-d2ea-4d92-acf7-150aff42bf48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.554596 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88699460-d2ea-4d92-acf7-150aff42bf48" (UID: "88699460-d2ea-4d92-acf7-150aff42bf48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.614184 4892 generic.go:334] "Generic (PLEG): container finished" podID="2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" containerID="3d7ba1b53e5252313403655198580bfb151aeca1caa5cc295583ec81123f5097" exitCode=0 Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.614244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3","Type":"ContainerDied","Data":"3d7ba1b53e5252313403655198580bfb151aeca1caa5cc295583ec81123f5097"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.621219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17","Type":"ContainerStarted","Data":"057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.625018 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.625044 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pltl4\" (UniqueName: \"kubernetes.io/projected/88699460-d2ea-4d92-acf7-150aff42bf48-kube-api-access-pltl4\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.625055 4892 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88699460-d2ea-4d92-acf7-150aff42bf48-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.625063 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88699460-d2ea-4d92-acf7-150aff42bf48-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.645193 4892 generic.go:334] "Generic (PLEG): container finished" podID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerID="f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2" exitCode=0 Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.645296 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.645397 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61db98bb-314d-4dd3-9a11-88cac622dea5","Type":"ContainerDied","Data":"f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.645434 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61db98bb-314d-4dd3-9a11-88cac622dea5","Type":"ContainerDied","Data":"045150db74afd24ab33e9ba9b0f49d6ecce24470760b71deef4ef0bec6791e60"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.645451 4892 scope.go:117] "RemoveContainer" containerID="f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.658372 4892 generic.go:334] "Generic (PLEG): container finished" podID="88699460-d2ea-4d92-acf7-150aff42bf48" containerID="a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490" exitCode=0 Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.658431 4892 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88699460-d2ea-4d92-acf7-150aff42bf48","Type":"ContainerDied","Data":"a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.658455 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88699460-d2ea-4d92-acf7-150aff42bf48","Type":"ContainerDied","Data":"8f752cbfcff50720c8a79b5e9fdd65c943fa2011783b25a14bdd4250bd9beba3"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.658527 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.661578 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f2c6c93-db35-404a-a613-34cb4b2de98b","Type":"ContainerStarted","Data":"44d9909830ab3c80214f84b8268e297c01847ced103bc8b997716925647dcd46"} Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.677867 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.677856019 podStartE2EDuration="2.677856019s" podCreationTimestamp="2026-02-17 19:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:50.63708795 +0000 UTC m=+5822.012491205" watchObservedRunningTime="2026-02-17 19:20:50.677856019 +0000 UTC m=+5822.053259284" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.682892 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.682883085 podStartE2EDuration="2.682883085s" podCreationTimestamp="2026-02-17 19:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:50.676876613 +0000 UTC 
m=+5822.052279878" watchObservedRunningTime="2026-02-17 19:20:50.682883085 +0000 UTC m=+5822.058286350" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.704939 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.716737 4892 scope.go:117] "RemoveContainer" containerID="c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.727010 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.759808 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.764761 4892 scope.go:117] "RemoveContainer" containerID="f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.765203 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2\": container with ID starting with f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2 not found: ID does not exist" containerID="f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.765239 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2"} err="failed to get container status \"f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2\": rpc error: code = NotFound desc = could not find container \"f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2\": container with ID starting with f78a103f4ef89fb9ee2aa1d4fab94d0d3ef04bfc9ee8f212eed1cf44f7afb6a2 not found: ID does not exist" Feb 17 19:20:50 crc 
kubenswrapper[4892]: I0217 19:20:50.765267 4892 scope.go:117] "RemoveContainer" containerID="c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.765559 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75\": container with ID starting with c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75 not found: ID does not exist" containerID="c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.765589 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75"} err="failed to get container status \"c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75\": rpc error: code = NotFound desc = could not find container \"c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75\": container with ID starting with c5fc9d5719f755770060276b956e0dcb1c0188bfca63321e05144704a16cfa75 not found: ID does not exist" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.765606 4892 scope.go:117] "RemoveContainer" containerID="a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.775403 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.803044 4892 scope.go:117] "RemoveContainer" containerID="cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.803217 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.803751 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-api" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.803775 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-api" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.803832 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-log" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.803842 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-log" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.803868 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-metadata" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.803876 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-metadata" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.803890 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-log" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.803897 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-log" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.804149 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-api" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.804182 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-log" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.804198 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" containerName="nova-metadata-metadata" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.804227 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" containerName="nova-api-log" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.805597 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.812999 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.816515 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.833569 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.833852 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.837444 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.846245 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.873982 4892 scope.go:117] "RemoveContainer" containerID="a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.874364 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490\": container with ID starting with a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490 not found: ID does not exist" containerID="a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.874400 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490"} err="failed to get container status \"a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490\": rpc error: code = NotFound desc = could not find container \"a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490\": container with ID starting with a85168328370849ba524f17ec5f25cd2c99b6466753bcdfa951f650acd3bf490 not found: ID does not exist" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.874426 4892 scope.go:117] "RemoveContainer" containerID="cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545" Feb 17 19:20:50 crc kubenswrapper[4892]: E0217 19:20:50.874779 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545\": container with ID starting with cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545 not found: ID does not exist" containerID="cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.874820 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545"} err="failed to get container status \"cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545\": rpc error: code = NotFound desc = could not find container \"cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545\": container with ID starting with cd39d941e73d414b020632b87dfae80090b58672310a138db82a228a57a08545 not found: ID does not exist" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942121 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8ng\" (UniqueName: \"kubernetes.io/projected/74f65fe3-a32c-4468-aabd-7db13aa21aa4-kube-api-access-cx8ng\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942205 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnn2\" (UniqueName: \"kubernetes.io/projected/d2730df2-b773-4353-9b44-9dcc2b516221-kube-api-access-6gnn2\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " 
pod="openstack/nova-metadata-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942257 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-config-data\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942272 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942317 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f65fe3-a32c-4468-aabd-7db13aa21aa4-logs\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942336 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-config-data\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:50 crc kubenswrapper[4892]: I0217 19:20:50.942403 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2730df2-b773-4353-9b44-9dcc2b516221-logs\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.022634 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047140 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8ng\" (UniqueName: \"kubernetes.io/projected/74f65fe3-a32c-4468-aabd-7db13aa21aa4-kube-api-access-cx8ng\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047206 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047235 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnn2\" (UniqueName: \"kubernetes.io/projected/d2730df2-b773-4353-9b44-9dcc2b516221-kube-api-access-6gnn2\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047256 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-config-data\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047316 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f65fe3-a32c-4468-aabd-7db13aa21aa4-logs\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047336 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-config-data\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2730df2-b773-4353-9b44-9dcc2b516221-logs\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.047743 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2730df2-b773-4353-9b44-9dcc2b516221-logs\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.052400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f65fe3-a32c-4468-aabd-7db13aa21aa4-logs\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.059034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-config-data\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.060281 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-config-data\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.069316 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.071386 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8ng\" (UniqueName: \"kubernetes.io/projected/74f65fe3-a32c-4468-aabd-7db13aa21aa4-kube-api-access-cx8ng\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.074591 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gnn2\" (UniqueName: \"kubernetes.io/projected/d2730df2-b773-4353-9b44-9dcc2b516221-kube-api-access-6gnn2\") pod \"nova-metadata-0\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.075371 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.143509 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.148472 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljjwm\" (UniqueName: \"kubernetes.io/projected/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-kube-api-access-ljjwm\") pod \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.148611 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-config-data\") pod \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.148773 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-combined-ca-bundle\") pod \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\" (UID: \"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3\") " Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.151890 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-kube-api-access-ljjwm" (OuterVolumeSpecName: "kube-api-access-ljjwm") pod "2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" (UID: "2f3b01f4-cae0-43d8-8d5a-5eb7052304e3"). InnerVolumeSpecName "kube-api-access-ljjwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.161268 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.176971 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-config-data" (OuterVolumeSpecName: "config-data") pod "2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" (UID: "2f3b01f4-cae0-43d8-8d5a-5eb7052304e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.178366 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" (UID: "2f3b01f4-cae0-43d8-8d5a-5eb7052304e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.251583 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.251644 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljjwm\" (UniqueName: \"kubernetes.io/projected/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-kube-api-access-ljjwm\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.251660 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.374854 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61db98bb-314d-4dd3-9a11-88cac622dea5" path="/var/lib/kubelet/pods/61db98bb-314d-4dd3-9a11-88cac622dea5/volumes" Feb 17 
19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.375474 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88699460-d2ea-4d92-acf7-150aff42bf48" path="/var/lib/kubelet/pods/88699460-d2ea-4d92-acf7-150aff42bf48/volumes" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.666307 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.691170 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.691509 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.691775 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2f3b01f4-cae0-43d8-8d5a-5eb7052304e3","Type":"ContainerDied","Data":"adfe6ce88553b02c5949e2a13827ff8d7ec6c2f5c142aabf5565fac0ef709b34"} Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.691838 4892 scope.go:117] "RemoveContainer" containerID="3d7ba1b53e5252313403655198580bfb151aeca1caa5cc295583ec81123f5097" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.697955 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74f65fe3-a32c-4468-aabd-7db13aa21aa4","Type":"ContainerStarted","Data":"83c872092d163fa24e137a4766ea36cb3df34dc0f787c63f5fb1cd59ab346371"} Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.869498 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.916011 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.933205 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:20:51 crc 
kubenswrapper[4892]: E0217 19:20:51.933999 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" containerName="nova-cell0-conductor-conductor" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.934029 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" containerName="nova-cell0-conductor-conductor" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.934284 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" containerName="nova-cell0-conductor-conductor" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.935182 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.938074 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 19:20:51 crc kubenswrapper[4892]: I0217 19:20:51.961486 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.068264 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tssfw\" (UniqueName: \"kubernetes.io/projected/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-kube-api-access-tssfw\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.068303 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.068386 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.172122 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.172253 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tssfw\" (UniqueName: \"kubernetes.io/projected/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-kube-api-access-tssfw\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.172272 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.176349 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.176483 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.195279 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tssfw\" (UniqueName: \"kubernetes.io/projected/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-kube-api-access-tssfw\") pod \"nova-cell0-conductor-0\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.257737 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.360281 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:20:52 crc kubenswrapper[4892]: E0217 19:20:52.360670 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.718157 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74f65fe3-a32c-4468-aabd-7db13aa21aa4","Type":"ContainerStarted","Data":"b9de8cff655d97851c673048b173338dbbdc3352968e715b840eb2394b079902"} Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.718464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"74f65fe3-a32c-4468-aabd-7db13aa21aa4","Type":"ContainerStarted","Data":"87aac0b422a5c6fa015611fc43442b8018f3af9bd129b89100a498b163553bab"} Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.721921 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2730df2-b773-4353-9b44-9dcc2b516221","Type":"ContainerStarted","Data":"b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e"} Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.721976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2730df2-b773-4353-9b44-9dcc2b516221","Type":"ContainerStarted","Data":"9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b"} Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.722003 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2730df2-b773-4353-9b44-9dcc2b516221","Type":"ContainerStarted","Data":"3e37c35f3af4419064c8cf5cf8169daf9e829e6f37f4d15cb535e72fc5732ef7"} Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.743522 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 19:20:52 crc kubenswrapper[4892]: W0217 19:20:52.743930 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62a3b6f_05ed_4c4f_8c4a_357db18967d9.slice/crio-5b5b4dc24ee48a92e732682d1c20d60b7befb52e0cb00b8e9d5677cc01d93b2c WatchSource:0}: Error finding container 5b5b4dc24ee48a92e732682d1c20d60b7befb52e0cb00b8e9d5677cc01d93b2c: Status 404 returned error can't find the container with id 5b5b4dc24ee48a92e732682d1c20d60b7befb52e0cb00b8e9d5677cc01d93b2c Feb 17 19:20:52 crc kubenswrapper[4892]: I0217 19:20:52.765223 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.765206535 podStartE2EDuration="2.765206535s" 
podCreationTimestamp="2026-02-17 19:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:52.752123493 +0000 UTC m=+5824.127526768" watchObservedRunningTime="2026-02-17 19:20:52.765206535 +0000 UTC m=+5824.140609800" Feb 17 19:20:53 crc kubenswrapper[4892]: I0217 19:20:53.378236 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3b01f4-cae0-43d8-8d5a-5eb7052304e3" path="/var/lib/kubelet/pods/2f3b01f4-cae0-43d8-8d5a-5eb7052304e3/volumes" Feb 17 19:20:53 crc kubenswrapper[4892]: I0217 19:20:53.732940 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d62a3b6f-05ed-4c4f-8c4a-357db18967d9","Type":"ContainerStarted","Data":"e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa"} Feb 17 19:20:53 crc kubenswrapper[4892]: I0217 19:20:53.732999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d62a3b6f-05ed-4c4f-8c4a-357db18967d9","Type":"ContainerStarted","Data":"5b5b4dc24ee48a92e732682d1c20d60b7befb52e0cb00b8e9d5677cc01d93b2c"} Feb 17 19:20:53 crc kubenswrapper[4892]: I0217 19:20:53.750484 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.750463642 podStartE2EDuration="3.750463642s" podCreationTimestamp="2026-02-17 19:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:52.793446196 +0000 UTC m=+5824.168849461" watchObservedRunningTime="2026-02-17 19:20:53.750463642 +0000 UTC m=+5825.125866907" Feb 17 19:20:53 crc kubenswrapper[4892]: I0217 19:20:53.755477 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.755465677 podStartE2EDuration="2.755465677s" 
podCreationTimestamp="2026-02-17 19:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:53.746257938 +0000 UTC m=+5825.121661203" watchObservedRunningTime="2026-02-17 19:20:53.755465677 +0000 UTC m=+5825.130868942" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.034084 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.054931 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.355700 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.426836 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25ld4\" (UniqueName: \"kubernetes.io/projected/dba96c41-f19e-47ca-9cd0-1cf12d32d448-kube-api-access-25ld4\") pod \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.426893 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-combined-ca-bundle\") pod \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.427106 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-config-data\") pod \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\" (UID: \"dba96c41-f19e-47ca-9cd0-1cf12d32d448\") " Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.433260 4892 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba96c41-f19e-47ca-9cd0-1cf12d32d448-kube-api-access-25ld4" (OuterVolumeSpecName: "kube-api-access-25ld4") pod "dba96c41-f19e-47ca-9cd0-1cf12d32d448" (UID: "dba96c41-f19e-47ca-9cd0-1cf12d32d448"). InnerVolumeSpecName "kube-api-access-25ld4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.458478 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-config-data" (OuterVolumeSpecName: "config-data") pod "dba96c41-f19e-47ca-9cd0-1cf12d32d448" (UID: "dba96c41-f19e-47ca-9cd0-1cf12d32d448"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.458487 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dba96c41-f19e-47ca-9cd0-1cf12d32d448" (UID: "dba96c41-f19e-47ca-9cd0-1cf12d32d448"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.529413 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.529684 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25ld4\" (UniqueName: \"kubernetes.io/projected/dba96c41-f19e-47ca-9cd0-1cf12d32d448-kube-api-access-25ld4\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.529697 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba96c41-f19e-47ca-9cd0-1cf12d32d448-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.753544 4892 generic.go:334] "Generic (PLEG): container finished" podID="dba96c41-f19e-47ca-9cd0-1cf12d32d448" containerID="71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381" exitCode=0 Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.753678 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.753688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dba96c41-f19e-47ca-9cd0-1cf12d32d448","Type":"ContainerDied","Data":"71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381"} Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.755306 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dba96c41-f19e-47ca-9cd0-1cf12d32d448","Type":"ContainerDied","Data":"aef695fb4441ae72ad5b69d3f2cfa0e99efab8f819d61e69cbb5c40fddbe3913"} Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.755345 4892 scope.go:117] "RemoveContainer" containerID="71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.755948 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.795187 4892 scope.go:117] "RemoveContainer" containerID="71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381" Feb 17 19:20:54 crc kubenswrapper[4892]: E0217 19:20:54.795891 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381\": container with ID starting with 71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381 not found: ID does not exist" containerID="71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.796000 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381"} err="failed to get container status 
\"71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381\": rpc error: code = NotFound desc = could not find container \"71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381\": container with ID starting with 71311d9426c73238e86516e341e460027b0269746d4b74ba0cef6f1380147381 not found: ID does not exist" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.846793 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.859089 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.883512 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:20:54 crc kubenswrapper[4892]: E0217 19:20:54.884470 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba96c41-f19e-47ca-9cd0-1cf12d32d448" containerName="nova-cell1-conductor-conductor" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.884620 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba96c41-f19e-47ca-9cd0-1cf12d32d448" containerName="nova-cell1-conductor-conductor" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.885151 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba96c41-f19e-47ca-9cd0-1cf12d32d448" containerName="nova-cell1-conductor-conductor" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.903108 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.909376 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:20:54 crc kubenswrapper[4892]: I0217 19:20:54.912755 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.040728 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrvl\" (UniqueName: \"kubernetes.io/projected/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-kube-api-access-mbrvl\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.040798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.040862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.142764 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbrvl\" (UniqueName: \"kubernetes.io/projected/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-kube-api-access-mbrvl\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 
19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.142866 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.142916 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.155579 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.155595 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.160734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbrvl\" (UniqueName: \"kubernetes.io/projected/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-kube-api-access-mbrvl\") pod \"nova-cell1-conductor-0\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.249538 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.382757 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba96c41-f19e-47ca-9cd0-1cf12d32d448" path="/var/lib/kubelet/pods/dba96c41-f19e-47ca-9cd0-1cf12d32d448/volumes" Feb 17 19:20:55 crc kubenswrapper[4892]: I0217 19:20:55.816088 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 19:20:55 crc kubenswrapper[4892]: W0217 19:20:55.822017 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74eb0508_84ab_4ac0_a7b4_5cdc321c3c4a.slice/crio-b61d938e01773abc1c5cc198308cf4b262f97b00eb5606512a6b2e5079fef25c WatchSource:0}: Error finding container b61d938e01773abc1c5cc198308cf4b262f97b00eb5606512a6b2e5079fef25c: Status 404 returned error can't find the container with id b61d938e01773abc1c5cc198308cf4b262f97b00eb5606512a6b2e5079fef25c Feb 17 19:20:56 crc kubenswrapper[4892]: I0217 19:20:56.144171 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 19:20:56 crc kubenswrapper[4892]: I0217 19:20:56.144215 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 19:20:56 crc kubenswrapper[4892]: I0217 19:20:56.800166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a","Type":"ContainerStarted","Data":"07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea"} Feb 17 19:20:56 crc kubenswrapper[4892]: I0217 19:20:56.800479 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a","Type":"ContainerStarted","Data":"b61d938e01773abc1c5cc198308cf4b262f97b00eb5606512a6b2e5079fef25c"} Feb 17 19:20:56 crc kubenswrapper[4892]: 
I0217 19:20:56.800500 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 19:20:56 crc kubenswrapper[4892]: I0217 19:20:56.827112 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.827031082 podStartE2EDuration="2.827031082s" podCreationTimestamp="2026-02-17 19:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:20:56.816119288 +0000 UTC m=+5828.191522563" watchObservedRunningTime="2026-02-17 19:20:56.827031082 +0000 UTC m=+5828.202434347" Feb 17 19:20:57 crc kubenswrapper[4892]: I0217 19:20:57.304591 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 19:20:59 crc kubenswrapper[4892]: I0217 19:20:59.033547 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:59 crc kubenswrapper[4892]: I0217 19:20:59.051783 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:59 crc kubenswrapper[4892]: I0217 19:20:59.054731 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 19:20:59 crc kubenswrapper[4892]: I0217 19:20:59.090703 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 19:20:59 crc kubenswrapper[4892]: I0217 19:20:59.847565 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 19:20:59 crc kubenswrapper[4892]: I0217 19:20:59.895284 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 19:21:01 crc kubenswrapper[4892]: I0217 19:21:01.144611 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 19:21:01 crc kubenswrapper[4892]: I0217 19:21:01.146625 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 19:21:01 crc kubenswrapper[4892]: I0217 19:21:01.162907 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 19:21:01 crc kubenswrapper[4892]: I0217 19:21:01.162970 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 19:21:02 crc kubenswrapper[4892]: I0217 19:21:02.310035 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.112:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:21:02 crc kubenswrapper[4892]: I0217 19:21:02.310035 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.113:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:21:02 crc kubenswrapper[4892]: I0217 19:21:02.310146 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.113:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:21:02 crc kubenswrapper[4892]: I0217 19:21:02.310149 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.112:8775/\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.282394 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.544435 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.549869 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.556162 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.580010 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.717209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.717337 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.717565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.717678 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsg4p\" (UniqueName: \"kubernetes.io/projected/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-kube-api-access-dsg4p\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.717850 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-scripts\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.717971 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.820262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.820307 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc 
kubenswrapper[4892]: I0217 19:21:05.820375 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.820396 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsg4p\" (UniqueName: \"kubernetes.io/projected/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-kube-api-access-dsg4p\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.820457 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.820467 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-scripts\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.820578 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.825559 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.825883 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.826736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.826879 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-scripts\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.862613 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsg4p\" (UniqueName: \"kubernetes.io/projected/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-kube-api-access-dsg4p\") pod \"cinder-scheduler-0\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:05 crc kubenswrapper[4892]: I0217 19:21:05.874940 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 19:21:06 crc kubenswrapper[4892]: I0217 19:21:06.363362 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:06 crc kubenswrapper[4892]: W0217 19:21:06.364709 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaecb191_fb7b_44fd_b27c_ec755ed4ee4a.slice/crio-0d4ee1078bb3cc2af9ac3925950bff6996b86005a2b14a7878252040d276d361 WatchSource:0}: Error finding container 0d4ee1078bb3cc2af9ac3925950bff6996b86005a2b14a7878252040d276d361: Status 404 returned error can't find the container with id 0d4ee1078bb3cc2af9ac3925950bff6996b86005a2b14a7878252040d276d361 Feb 17 19:21:06 crc kubenswrapper[4892]: I0217 19:21:06.954360 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"caecb191-fb7b-44fd-b27c-ec755ed4ee4a","Type":"ContainerStarted","Data":"0d4ee1078bb3cc2af9ac3925950bff6996b86005a2b14a7878252040d276d361"} Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.254658 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.255273 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api-log" containerID="cri-o://1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b" gracePeriod=30 Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.255386 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api" containerID="cri-o://7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5" gracePeriod=30 Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.365724 4892 scope.go:117] "RemoveContainer" 
containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:21:07 crc kubenswrapper[4892]: E0217 19:21:07.366070 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.936307 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.938242 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.940903 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.956890 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.975644 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"caecb191-fb7b-44fd-b27c-ec755ed4ee4a","Type":"ContainerStarted","Data":"a163fc53b4c3c229dd7e4fbad7a23c23d502aac9f6920e286b7a0b309c9b8371"} Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.975694 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"caecb191-fb7b-44fd-b27c-ec755ed4ee4a","Type":"ContainerStarted","Data":"4491fc369c58ac9fc17501a31c2cb8206375b67c71fa4aff1f279c0269aa964f"} Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.991471 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerID="1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b" exitCode=143 Feb 17 19:21:07 crc kubenswrapper[4892]: I0217 19:21:07.992083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db264755-2d6e-4d13-ab1f-1a4b3712b242","Type":"ContainerDied","Data":"1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b"} Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.068806 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.068921 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.068943 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.068990 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " 
pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069022 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069045 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-sys\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069094 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwsg9\" (UniqueName: \"kubernetes.io/projected/196db019-189a-4787-a766-f9ae8d46cbea-kube-api-access-mwsg9\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069117 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069150 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " 
pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069174 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069190 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/196db019-189a-4787-a766-f9ae8d46cbea-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-run\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069239 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 
19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069280 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.069297 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-dev\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170468 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170489 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170517 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170557 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170580 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-sys\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170619 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwsg9\" (UniqueName: \"kubernetes.io/projected/196db019-189a-4787-a766-f9ae8d46cbea-kube-api-access-mwsg9\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170644 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170673 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170704 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/196db019-189a-4787-a766-f9ae8d46cbea-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170736 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-run\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170763 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170777 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170779 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171206 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-sys\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171367 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171425 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-run\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.170794 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171439 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171481 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-dev\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171502 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171507 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171541 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.171864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/196db019-189a-4787-a766-f9ae8d46cbea-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.174893 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.175337 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.175646 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/196db019-189a-4787-a766-f9ae8d46cbea-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.176093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.180229 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196db019-189a-4787-a766-f9ae8d46cbea-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.189326 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwsg9\" (UniqueName: \"kubernetes.io/projected/196db019-189a-4787-a766-f9ae8d46cbea-kube-api-access-mwsg9\") pod \"cinder-volume-volume1-0\" (UID: \"196db019-189a-4787-a766-f9ae8d46cbea\") " pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.261905 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.480273 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.480257127 podStartE2EDuration="3.480257127s" podCreationTimestamp="2026-02-17 19:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:21:08.004968475 +0000 UTC m=+5839.380371740" watchObservedRunningTime="2026-02-17 19:21:08.480257127 +0000 UTC m=+5839.855660392" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.493101 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.494918 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.498746 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.508254 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.681801 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-sys\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.681866 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.681895 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.681909 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682011 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-lib-modules\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682039 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682081 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-dev\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682114 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-config-data\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682133 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-run\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682153 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682210 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67zs\" (UniqueName: \"kubernetes.io/projected/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-kube-api-access-x67zs\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682228 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-scripts\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682248 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682276 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-config-data-custom\") pod 
\"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.682312 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-ceph\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784059 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-run\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784161 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x67zs\" (UniqueName: \"kubernetes.io/projected/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-kube-api-access-x67zs\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784188 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-scripts\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784214 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784251 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-ceph\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784269 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784334 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-sys\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784359 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784384 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: 
\"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784400 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784444 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-lib-modules\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784464 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784488 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784505 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-dev\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.784541 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-config-data\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785133 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-sys\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785213 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-run\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785274 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785306 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785324 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785420 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785461 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785495 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-dev\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785259 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-lib-modules\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.785473 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.790373 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-ceph\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " 
pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.790979 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-config-data\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.791465 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.795341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.812264 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67zs\" (UniqueName: \"kubernetes.io/projected/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-kube-api-access-x67zs\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.813751 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0d1a90-2e96-43e4-9ed7-b375dd729dd5-scripts\") pod \"cinder-backup-0\" (UID: \"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5\") " pod="openstack/cinder-backup-0" Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.854069 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 17 19:21:08 crc 
kubenswrapper[4892]: W0217 19:21:08.854679 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod196db019_189a_4787_a766_f9ae8d46cbea.slice/crio-886f3117b9b48ffe7f8a8ba812695ac84d79c70ce22145d605e545401ab0b494 WatchSource:0}: Error finding container 886f3117b9b48ffe7f8a8ba812695ac84d79c70ce22145d605e545401ab0b494: Status 404 returned error can't find the container with id 886f3117b9b48ffe7f8a8ba812695ac84d79c70ce22145d605e545401ab0b494 Feb 17 19:21:08 crc kubenswrapper[4892]: I0217 19:21:08.857752 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 19:21:09 crc kubenswrapper[4892]: I0217 19:21:09.008089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"196db019-189a-4787-a766-f9ae8d46cbea","Type":"ContainerStarted","Data":"886f3117b9b48ffe7f8a8ba812695ac84d79c70ce22145d605e545401ab0b494"} Feb 17 19:21:09 crc kubenswrapper[4892]: I0217 19:21:09.113554 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 17 19:21:09 crc kubenswrapper[4892]: I0217 19:21:09.629273 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 17 19:21:10 crc kubenswrapper[4892]: I0217 19:21:10.018620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"196db019-189a-4787-a766-f9ae8d46cbea","Type":"ContainerStarted","Data":"db2550b3cdb25f2434beae89d4ccd9cb6f9c356d54d7aed7c2a9c6e6ed0e5700"} Feb 17 19:21:10 crc kubenswrapper[4892]: I0217 19:21:10.019672 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5","Type":"ContainerStarted","Data":"650f08b510a85ea07ac474a606d02f7b12de07c28017e178a9c98e515de1f95f"} Feb 17 19:21:10 crc kubenswrapper[4892]: I0217 19:21:10.517463 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.109:8776/healthcheck\": read tcp 10.217.0.2:47944->10.217.1.109:8776: read: connection reset by peer" Feb 17 19:21:10 crc kubenswrapper[4892]: I0217 19:21:10.852797 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 19:21:10 crc kubenswrapper[4892]: I0217 19:21:10.878913 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.029421 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data-custom\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.029718 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.029842 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-scripts\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.029879 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-combined-ca-bundle\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.029932 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db264755-2d6e-4d13-ab1f-1a4b3712b242-etc-machine-id\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 
crc kubenswrapper[4892]: I0217 19:21:11.029952 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db264755-2d6e-4d13-ab1f-1a4b3712b242-logs\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.030022 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl5pr\" (UniqueName: \"kubernetes.io/projected/db264755-2d6e-4d13-ab1f-1a4b3712b242-kube-api-access-sl5pr\") pod \"db264755-2d6e-4d13-ab1f-1a4b3712b242\" (UID: \"db264755-2d6e-4d13-ab1f-1a4b3712b242\") " Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.031695 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db264755-2d6e-4d13-ab1f-1a4b3712b242-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.032171 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db264755-2d6e-4d13-ab1f-1a4b3712b242-logs" (OuterVolumeSpecName: "logs") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.034034 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"196db019-189a-4787-a766-f9ae8d46cbea","Type":"ContainerStarted","Data":"d6421cb98bccc4cfd0fae2bdaa32c99f48375687306ed8208d7e84cba462dc3b"} Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.034287 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db264755-2d6e-4d13-ab1f-1a4b3712b242-kube-api-access-sl5pr" (OuterVolumeSpecName: "kube-api-access-sl5pr") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "kube-api-access-sl5pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.036900 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.037676 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-scripts" (OuterVolumeSpecName: "scripts") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.042270 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5","Type":"ContainerStarted","Data":"60dc909b8d96e4924d4f22af0fb674cc83ae960af854719e5995749a35b70901"} Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.055316 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.056210 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db264755-2d6e-4d13-ab1f-1a4b3712b242","Type":"ContainerDied","Data":"7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5"} Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.056247 4892 scope.go:117] "RemoveContainer" containerID="7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.055045 4892 generic.go:334] "Generic (PLEG): container finished" podID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerID="7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5" exitCode=0 Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.060368 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db264755-2d6e-4d13-ab1f-1a4b3712b242","Type":"ContainerDied","Data":"ff8ce85defc5cf70b20da8db703e51504947e47aa18cf1f69a817d9e73a02c4c"} Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.061051 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.314020974 podStartE2EDuration="4.061041834s" podCreationTimestamp="2026-02-17 19:21:07 +0000 UTC" firstStartedPulling="2026-02-17 19:21:08.857316378 +0000 UTC m=+5840.232719643" lastFinishedPulling="2026-02-17 19:21:09.604337238 +0000 UTC 
m=+5840.979740503" observedRunningTime="2026-02-17 19:21:11.060792898 +0000 UTC m=+5842.436196163" watchObservedRunningTime="2026-02-17 19:21:11.061041834 +0000 UTC m=+5842.436445099" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.098998 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.099674 4892 scope.go:117] "RemoveContainer" containerID="1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.112179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data" (OuterVolumeSpecName: "config-data") pod "db264755-2d6e-4d13-ab1f-1a4b3712b242" (UID: "db264755-2d6e-4d13-ab1f-1a4b3712b242"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.128856 4892 scope.go:117] "RemoveContainer" containerID="7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5" Feb 17 19:21:11 crc kubenswrapper[4892]: E0217 19:21:11.130398 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5\": container with ID starting with 7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5 not found: ID does not exist" containerID="7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.130446 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5"} err="failed to get container status \"7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5\": rpc error: code = NotFound desc = could not find container \"7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5\": container with ID starting with 7dd04ecbe0ec3921048d1c69f234d5ca72164af63de73c2812832f5df598b0a5 not found: ID does not exist" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.130477 4892 scope.go:117] "RemoveContainer" containerID="1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132704 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132729 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-config-data\") on node \"crc\" DevicePath 
\"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132737 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132746 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db264755-2d6e-4d13-ab1f-1a4b3712b242-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132756 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db264755-2d6e-4d13-ab1f-1a4b3712b242-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132765 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db264755-2d6e-4d13-ab1f-1a4b3712b242-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.132772 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl5pr\" (UniqueName: \"kubernetes.io/projected/db264755-2d6e-4d13-ab1f-1a4b3712b242-kube-api-access-sl5pr\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:11 crc kubenswrapper[4892]: E0217 19:21:11.133664 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b\": container with ID starting with 1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b not found: ID does not exist" containerID="1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.133699 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b"} err="failed to get container status \"1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b\": rpc error: code = NotFound desc = could not find container \"1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b\": container with ID starting with 1ab0b224df9ec86f11b84944dc6f8b4c202c7a169e3ea90883e69ff0e285ae6b not found: ID does not exist" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.153783 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.154311 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.158257 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.165413 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.165838 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.166706 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.172661 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.426119 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.442990 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.464244 4892 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:21:11 crc kubenswrapper[4892]: E0217 19:21:11.465895 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api-log" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.465914 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api-log" Feb 17 19:21:11 crc kubenswrapper[4892]: E0217 19:21:11.465945 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.465951 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.466179 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.466210 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" containerName="cinder-api-log" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.467311 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.469663 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.472752 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.644461 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-scripts\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.644641 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-config-data\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.644960 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.645147 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b762d850-e72b-4151-97c1-c6e1f8c9e76f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.645324 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b762d850-e72b-4151-97c1-c6e1f8c9e76f-logs\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.645547 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.645688 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj5v4\" (UniqueName: \"kubernetes.io/projected/b762d850-e72b-4151-97c1-c6e1f8c9e76f-kube-api-access-rj5v4\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748288 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b762d850-e72b-4151-97c1-c6e1f8c9e76f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748372 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b762d850-e72b-4151-97c1-c6e1f8c9e76f-logs\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748447 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748499 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj5v4\" (UniqueName: \"kubernetes.io/projected/b762d850-e72b-4151-97c1-c6e1f8c9e76f-kube-api-access-rj5v4\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748533 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-scripts\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748566 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-config-data\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.748636 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.749762 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b762d850-e72b-4151-97c1-c6e1f8c9e76f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.750459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b762d850-e72b-4151-97c1-c6e1f8c9e76f-logs\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.754416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-scripts\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.755897 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.757531 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.758028 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b762d850-e72b-4151-97c1-c6e1f8c9e76f-config-data\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.776518 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj5v4\" (UniqueName: \"kubernetes.io/projected/b762d850-e72b-4151-97c1-c6e1f8c9e76f-kube-api-access-rj5v4\") pod \"cinder-api-0\" (UID: \"b762d850-e72b-4151-97c1-c6e1f8c9e76f\") " pod="openstack/cinder-api-0" Feb 17 19:21:11 crc kubenswrapper[4892]: I0217 19:21:11.794293 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 19:21:12 crc kubenswrapper[4892]: I0217 19:21:12.079120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0c0d1a90-2e96-43e4-9ed7-b375dd729dd5","Type":"ContainerStarted","Data":"4549317e09d97a96955873f5db7cefe83fc9153ce705593ee7e74991a8869f01"} Feb 17 19:21:12 crc kubenswrapper[4892]: I0217 19:21:12.085572 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 19:21:12 crc kubenswrapper[4892]: I0217 19:21:12.088897 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 19:21:12 crc kubenswrapper[4892]: I0217 19:21:12.091019 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 19:21:12 crc kubenswrapper[4892]: I0217 19:21:12.128268 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.258176802 podStartE2EDuration="4.128252643s" podCreationTimestamp="2026-02-17 19:21:08 +0000 UTC" firstStartedPulling="2026-02-17 19:21:09.624457911 +0000 UTC m=+5840.999861186" lastFinishedPulling="2026-02-17 19:21:10.494533762 +0000 UTC m=+5841.869937027" observedRunningTime="2026-02-17 19:21:12.107217636 +0000 UTC m=+5843.482620911" watchObservedRunningTime="2026-02-17 19:21:12.128252643 +0000 UTC m=+5843.503655908" Feb 17 19:21:12 crc kubenswrapper[4892]: I0217 19:21:12.333636 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 19:21:12 crc kubenswrapper[4892]: W0217 19:21:12.338497 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb762d850_e72b_4151_97c1_c6e1f8c9e76f.slice/crio-563bf27dfe23bd05f0d616f5edfaa645ac68414f219a0679a89d9dd4c73f0995 WatchSource:0}: Error finding 
container 563bf27dfe23bd05f0d616f5edfaa645ac68414f219a0679a89d9dd4c73f0995: Status 404 returned error can't find the container with id 563bf27dfe23bd05f0d616f5edfaa645ac68414f219a0679a89d9dd4c73f0995 Feb 17 19:21:13 crc kubenswrapper[4892]: I0217 19:21:13.101517 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b762d850-e72b-4151-97c1-c6e1f8c9e76f","Type":"ContainerStarted","Data":"0d4916de25b2957b35c93f18ecc14ba508cdcbe7b1383842991f96d859b19efe"} Feb 17 19:21:13 crc kubenswrapper[4892]: I0217 19:21:13.102254 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b762d850-e72b-4151-97c1-c6e1f8c9e76f","Type":"ContainerStarted","Data":"563bf27dfe23bd05f0d616f5edfaa645ac68414f219a0679a89d9dd4c73f0995"} Feb 17 19:21:13 crc kubenswrapper[4892]: I0217 19:21:13.262005 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:13 crc kubenswrapper[4892]: I0217 19:21:13.379001 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db264755-2d6e-4d13-ab1f-1a4b3712b242" path="/var/lib/kubelet/pods/db264755-2d6e-4d13-ab1f-1a4b3712b242/volumes" Feb 17 19:21:14 crc kubenswrapper[4892]: I0217 19:21:14.114164 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 17 19:21:14 crc kubenswrapper[4892]: I0217 19:21:14.118702 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b762d850-e72b-4151-97c1-c6e1f8c9e76f","Type":"ContainerStarted","Data":"0dd0a3874a07611261a9cfe7f59f0e1be9a8413645802c577787cda500c8951f"} Feb 17 19:21:14 crc kubenswrapper[4892]: I0217 19:21:14.119686 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 19:21:14 crc kubenswrapper[4892]: I0217 19:21:14.140908 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=3.140886794 podStartE2EDuration="3.140886794s" podCreationTimestamp="2026-02-17 19:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:21:14.139007943 +0000 UTC m=+5845.514411228" watchObservedRunningTime="2026-02-17 19:21:14.140886794 +0000 UTC m=+5845.516290069" Feb 17 19:21:16 crc kubenswrapper[4892]: I0217 19:21:16.079895 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 19:21:16 crc kubenswrapper[4892]: I0217 19:21:16.150043 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:16 crc kubenswrapper[4892]: I0217 19:21:16.154934 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="cinder-scheduler" containerID="cri-o://4491fc369c58ac9fc17501a31c2cb8206375b67c71fa4aff1f279c0269aa964f" gracePeriod=30 Feb 17 19:21:16 crc kubenswrapper[4892]: I0217 19:21:16.155712 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="probe" containerID="cri-o://a163fc53b4c3c229dd7e4fbad7a23c23d502aac9f6920e286b7a0b309c9b8371" gracePeriod=30 Feb 17 19:21:17 crc kubenswrapper[4892]: I0217 19:21:17.172064 4892 generic.go:334] "Generic (PLEG): container finished" podID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerID="a163fc53b4c3c229dd7e4fbad7a23c23d502aac9f6920e286b7a0b309c9b8371" exitCode=0 Feb 17 19:21:17 crc kubenswrapper[4892]: I0217 19:21:17.172131 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"caecb191-fb7b-44fd-b27c-ec755ed4ee4a","Type":"ContainerDied","Data":"a163fc53b4c3c229dd7e4fbad7a23c23d502aac9f6920e286b7a0b309c9b8371"} 
Feb 17 19:21:18 crc kubenswrapper[4892]: I0217 19:21:18.479358 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.197559 4892 generic.go:334] "Generic (PLEG): container finished" podID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerID="4491fc369c58ac9fc17501a31c2cb8206375b67c71fa4aff1f279c0269aa964f" exitCode=0 Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.197889 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"caecb191-fb7b-44fd-b27c-ec755ed4ee4a","Type":"ContainerDied","Data":"4491fc369c58ac9fc17501a31c2cb8206375b67c71fa4aff1f279c0269aa964f"} Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.375171 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.514077 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.638504 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-combined-ca-bundle\") pod \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.638560 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-etc-machine-id\") pod \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.638618 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-scripts\") pod \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.638653 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsg4p\" (UniqueName: \"kubernetes.io/projected/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-kube-api-access-dsg4p\") pod \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.638900 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data-custom\") pod \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.638936 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data\") pod \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\" (UID: \"caecb191-fb7b-44fd-b27c-ec755ed4ee4a\") " Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.640273 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "caecb191-fb7b-44fd-b27c-ec755ed4ee4a" (UID: "caecb191-fb7b-44fd-b27c-ec755ed4ee4a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.652687 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-kube-api-access-dsg4p" (OuterVolumeSpecName: "kube-api-access-dsg4p") pod "caecb191-fb7b-44fd-b27c-ec755ed4ee4a" (UID: "caecb191-fb7b-44fd-b27c-ec755ed4ee4a"). InnerVolumeSpecName "kube-api-access-dsg4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.652965 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "caecb191-fb7b-44fd-b27c-ec755ed4ee4a" (UID: "caecb191-fb7b-44fd-b27c-ec755ed4ee4a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.653073 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-scripts" (OuterVolumeSpecName: "scripts") pod "caecb191-fb7b-44fd-b27c-ec755ed4ee4a" (UID: "caecb191-fb7b-44fd-b27c-ec755ed4ee4a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.699558 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caecb191-fb7b-44fd-b27c-ec755ed4ee4a" (UID: "caecb191-fb7b-44fd-b27c-ec755ed4ee4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.742860 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.742898 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.742917 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.742939 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.742951 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsg4p\" (UniqueName: \"kubernetes.io/projected/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-kube-api-access-dsg4p\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.792667 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data" (OuterVolumeSpecName: "config-data") pod "caecb191-fb7b-44fd-b27c-ec755ed4ee4a" (UID: "caecb191-fb7b-44fd-b27c-ec755ed4ee4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:21:19 crc kubenswrapper[4892]: I0217 19:21:19.844392 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caecb191-fb7b-44fd-b27c-ec755ed4ee4a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.210938 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"caecb191-fb7b-44fd-b27c-ec755ed4ee4a","Type":"ContainerDied","Data":"0d4ee1078bb3cc2af9ac3925950bff6996b86005a2b14a7878252040d276d361"} Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.211017 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.211364 4892 scope.go:117] "RemoveContainer" containerID="a163fc53b4c3c229dd7e4fbad7a23c23d502aac9f6920e286b7a0b309c9b8371" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.243305 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.249536 4892 scope.go:117] "RemoveContainer" containerID="4491fc369c58ac9fc17501a31c2cb8206375b67c71fa4aff1f279c0269aa964f" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.259193 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.271600 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:20 crc kubenswrapper[4892]: E0217 19:21:20.272188 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="probe" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.272205 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="probe" Feb 17 19:21:20 crc kubenswrapper[4892]: E0217 19:21:20.272226 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="cinder-scheduler" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.272235 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="cinder-scheduler" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.272539 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="probe" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.272571 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" containerName="cinder-scheduler" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.274237 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.278746 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.280751 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.359141 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:21:20 crc kubenswrapper[4892]: E0217 19:21:20.359714 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.455150 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.455223 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.455383 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-scripts\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.455427 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-config-data\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.455640 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.455739 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f57ng\" (UniqueName: \"kubernetes.io/projected/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-kube-api-access-f57ng\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.557226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.557305 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f57ng\" (UniqueName: 
\"kubernetes.io/projected/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-kube-api-access-f57ng\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.557372 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.557400 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.557469 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-scripts\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.557530 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-config-data\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.558195 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " 
pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.563626 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-config-data\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.563734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.563829 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.564535 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-scripts\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.579676 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f57ng\" (UniqueName: \"kubernetes.io/projected/b4368bec-a527-4e2e-bcd8-c3b83faf9bca-kube-api-access-f57ng\") pod \"cinder-scheduler-0\" (UID: \"b4368bec-a527-4e2e-bcd8-c3b83faf9bca\") " pod="openstack/cinder-scheduler-0" Feb 17 19:21:20 crc kubenswrapper[4892]: I0217 19:21:20.597480 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 19:21:21 crc kubenswrapper[4892]: I0217 19:21:21.159549 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 19:21:21 crc kubenswrapper[4892]: W0217 19:21:21.178478 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4368bec_a527_4e2e_bcd8_c3b83faf9bca.slice/crio-357d5aa04a992a601d37ad50fb2171edb8620e59229b6b22b4f933d292195bb5 WatchSource:0}: Error finding container 357d5aa04a992a601d37ad50fb2171edb8620e59229b6b22b4f933d292195bb5: Status 404 returned error can't find the container with id 357d5aa04a992a601d37ad50fb2171edb8620e59229b6b22b4f933d292195bb5 Feb 17 19:21:21 crc kubenswrapper[4892]: I0217 19:21:21.228651 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4368bec-a527-4e2e-bcd8-c3b83faf9bca","Type":"ContainerStarted","Data":"357d5aa04a992a601d37ad50fb2171edb8620e59229b6b22b4f933d292195bb5"} Feb 17 19:21:21 crc kubenswrapper[4892]: I0217 19:21:21.375545 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caecb191-fb7b-44fd-b27c-ec755ed4ee4a" path="/var/lib/kubelet/pods/caecb191-fb7b-44fd-b27c-ec755ed4ee4a/volumes" Feb 17 19:21:22 crc kubenswrapper[4892]: I0217 19:21:22.244933 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4368bec-a527-4e2e-bcd8-c3b83faf9bca","Type":"ContainerStarted","Data":"de23ec11ccda3ac5231046602e5635f6223cfd7736a2eedbb6f14d6d7821ded2"} Feb 17 19:21:23 crc kubenswrapper[4892]: I0217 19:21:23.259232 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4368bec-a527-4e2e-bcd8-c3b83faf9bca","Type":"ContainerStarted","Data":"e0304d3f4c992530f866849b862336a1aa3085d5aebf644c82dd7853fbf38e7c"} Feb 17 19:21:23 crc kubenswrapper[4892]: I0217 19:21:23.292320 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.292300082 podStartE2EDuration="3.292300082s" podCreationTimestamp="2026-02-17 19:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:21:23.286510566 +0000 UTC m=+5854.661913851" watchObservedRunningTime="2026-02-17 19:21:23.292300082 +0000 UTC m=+5854.667703347" Feb 17 19:21:23 crc kubenswrapper[4892]: I0217 19:21:23.681238 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 17 19:21:25 crc kubenswrapper[4892]: I0217 19:21:25.598227 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 19:21:30 crc kubenswrapper[4892]: I0217 19:21:30.823738 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 19:21:32 crc kubenswrapper[4892]: I0217 19:21:32.360437 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:21:32 crc kubenswrapper[4892]: E0217 19:21:32.361411 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:21:44 crc kubenswrapper[4892]: I0217 19:21:44.359934 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:21:44 crc kubenswrapper[4892]: E0217 19:21:44.361567 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:21:56 crc kubenswrapper[4892]: I0217 19:21:56.359804 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:21:56 crc kubenswrapper[4892]: E0217 19:21:56.360684 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:22:09 crc kubenswrapper[4892]: I0217 19:22:09.373633 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:22:09 crc kubenswrapper[4892]: E0217 19:22:09.374723 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:22:23 crc kubenswrapper[4892]: I0217 19:22:23.364627 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:22:23 crc kubenswrapper[4892]: E0217 19:22:23.365534 4892 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:22:36 crc kubenswrapper[4892]: I0217 19:22:36.360300 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:22:36 crc kubenswrapper[4892]: E0217 19:22:36.360918 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:22:48 crc kubenswrapper[4892]: I0217 19:22:48.359772 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:22:48 crc kubenswrapper[4892]: E0217 19:22:48.360809 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:22:53 crc kubenswrapper[4892]: I0217 19:22:53.043033 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bd2v5"] Feb 17 19:22:53 crc kubenswrapper[4892]: I0217 19:22:53.055790 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-28ef-account-create-update-rlgx4"] Feb 17 19:22:53 crc kubenswrapper[4892]: I0217 19:22:53.066544 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bd2v5"] Feb 17 19:22:53 crc kubenswrapper[4892]: I0217 19:22:53.075097 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-28ef-account-create-update-rlgx4"] Feb 17 19:22:53 crc kubenswrapper[4892]: I0217 19:22:53.370370 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6fb2ec-298f-4592-afd1-2eb29fb08684" path="/var/lib/kubelet/pods/5c6fb2ec-298f-4592-afd1-2eb29fb08684/volumes" Feb 17 19:22:53 crc kubenswrapper[4892]: I0217 19:22:53.371149 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b312301-b3f1-4ef3-bf19-cb59fe062e42" path="/var/lib/kubelet/pods/9b312301-b3f1-4ef3-bf19-cb59fe062e42/volumes" Feb 17 19:22:59 crc kubenswrapper[4892]: I0217 19:22:59.032504 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-s74rl"] Feb 17 19:22:59 crc kubenswrapper[4892]: I0217 19:22:59.046232 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-s74rl"] Feb 17 19:22:59 crc kubenswrapper[4892]: I0217 19:22:59.394320 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32656b5-b635-4abb-a3f0-fd6bc228c874" path="/var/lib/kubelet/pods/f32656b5-b635-4abb-a3f0-fd6bc228c874/volumes" Feb 17 19:23:02 crc kubenswrapper[4892]: I0217 19:23:02.359922 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:23:02 crc kubenswrapper[4892]: E0217 19:23:02.360778 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:23:02 crc kubenswrapper[4892]: I0217 19:23:02.689262 4892 scope.go:117] "RemoveContainer" containerID="14b1f15a077528a3858cd63adb897e0b4ae6a714ddd839d19d298cf8442aac1b" Feb 17 19:23:02 crc kubenswrapper[4892]: I0217 19:23:02.744491 4892 scope.go:117] "RemoveContainer" containerID="5add7a11b80cc08f7ced43002f94bbff103efe8941c5af602bc4243576960d27" Feb 17 19:23:02 crc kubenswrapper[4892]: I0217 19:23:02.819420 4892 scope.go:117] "RemoveContainer" containerID="1aeb8b42dd0468752bc3ecdd432d7564864f6dd46c59d60f10eee335910c95c1" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.490250 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9w7kn"] Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.491848 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.494933 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dczgr" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.497129 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.505618 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9w7kn"] Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.537808 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-pc6zw"] Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.570757 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pc6zw"] Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.570900 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.581874 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-log-ovn\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.582022 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v876d\" (UniqueName: \"kubernetes.io/projected/031573a9-543b-4333-a50c-d7e514eb3a41-kube-api-access-v876d\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.582100 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-run\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.582182 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/031573a9-543b-4333-a50c-d7e514eb3a41-scripts\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.582223 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-run-ovn\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" 
Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.684544 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v876d\" (UniqueName: \"kubernetes.io/projected/031573a9-543b-4333-a50c-d7e514eb3a41-kube-api-access-v876d\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.684956 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-lib\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.685152 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-run\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.685379 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bec279d-5310-4762-8f15-b0ad2d919df9-scripts\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.685484 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-run\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.685677 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-run\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.685909 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/031573a9-543b-4333-a50c-d7e514eb3a41-scripts\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.686260 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-log\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.686442 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-run-ovn\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.686783 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-etc-ovs\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.692559 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-log-ovn\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.692885 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkrl\" (UniqueName: \"kubernetes.io/projected/8bec279d-5310-4762-8f15-b0ad2d919df9-kube-api-access-vmkrl\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.687888 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/031573a9-543b-4333-a50c-d7e514eb3a41-scripts\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.692661 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-log-ovn\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.687950 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/031573a9-543b-4333-a50c-d7e514eb3a41-var-run-ovn\") pod \"ovn-controller-9w7kn\" (UID: \"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.706476 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v876d\" (UniqueName: \"kubernetes.io/projected/031573a9-543b-4333-a50c-d7e514eb3a41-kube-api-access-v876d\") pod \"ovn-controller-9w7kn\" (UID: 
\"031573a9-543b-4333-a50c-d7e514eb3a41\") " pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.796576 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bec279d-5310-4762-8f15-b0ad2d919df9-scripts\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.796679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-run\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.796743 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-log\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.796781 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-etc-ovs\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.796887 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkrl\" (UniqueName: \"kubernetes.io/projected/8bec279d-5310-4762-8f15-b0ad2d919df9-kube-api-access-vmkrl\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 
19:23:07.797010 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-lib\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.797273 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-lib\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.797382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-run\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.797451 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-var-log\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.797518 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8bec279d-5310-4762-8f15-b0ad2d919df9-etc-ovs\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.798648 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bec279d-5310-4762-8f15-b0ad2d919df9-scripts\") pod 
\"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.809900 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.825687 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkrl\" (UniqueName: \"kubernetes.io/projected/8bec279d-5310-4762-8f15-b0ad2d919df9-kube-api-access-vmkrl\") pod \"ovn-controller-ovs-pc6zw\" (UID: \"8bec279d-5310-4762-8f15-b0ad2d919df9\") " pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:07 crc kubenswrapper[4892]: I0217 19:23:07.906387 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.294937 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9w7kn"] Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.764546 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pc6zw"] Feb 17 19:23:08 crc kubenswrapper[4892]: W0217 19:23:08.772533 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bec279d_5310_4762_8f15_b0ad2d919df9.slice/crio-5fa0f93b2b3455bf20e8fa1782bb2297a1cd3b943716f2198226c6728949c9d5 WatchSource:0}: Error finding container 5fa0f93b2b3455bf20e8fa1782bb2297a1cd3b943716f2198226c6728949c9d5: Status 404 returned error can't find the container with id 5fa0f93b2b3455bf20e8fa1782bb2297a1cd3b943716f2198226c6728949c9d5 Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.837401 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pc6zw" 
event={"ID":"8bec279d-5310-4762-8f15-b0ad2d919df9","Type":"ContainerStarted","Data":"5fa0f93b2b3455bf20e8fa1782bb2297a1cd3b943716f2198226c6728949c9d5"} Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.842330 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn" event={"ID":"031573a9-543b-4333-a50c-d7e514eb3a41","Type":"ContainerStarted","Data":"5f1d27d84a61a9358cb4148bcbd2aeded6525c20fecbe65f26a6faaef58f1c94"} Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.842373 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn" event={"ID":"031573a9-543b-4333-a50c-d7e514eb3a41","Type":"ContainerStarted","Data":"8c06b9b94da01ea64ccc4d859ea311cf7ac2d93691b9448e0be4d85768598d24"} Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.842466 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:08 crc kubenswrapper[4892]: I0217 19:23:08.871649 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9w7kn" podStartSLOduration=1.8716291900000002 podStartE2EDuration="1.87162919s" podCreationTimestamp="2026-02-17 19:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:23:08.864034885 +0000 UTC m=+5960.239438150" watchObservedRunningTime="2026-02-17 19:23:08.87162919 +0000 UTC m=+5960.247032465" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.019779 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vjdzt"] Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.021516 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.041946 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vjdzt"] Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.054135 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.135294 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzldc\" (UniqueName: \"kubernetes.io/projected/ba17e82c-3558-4262-99b7-d37b403e49dd-kube-api-access-vzldc\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.135357 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ba17e82c-3558-4262-99b7-d37b403e49dd-ovs-rundir\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.135394 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ba17e82c-3558-4262-99b7-d37b403e49dd-ovn-rundir\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.135843 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba17e82c-3558-4262-99b7-d37b403e49dd-config\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " 
pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.237830 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzldc\" (UniqueName: \"kubernetes.io/projected/ba17e82c-3558-4262-99b7-d37b403e49dd-kube-api-access-vzldc\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.237909 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ba17e82c-3558-4262-99b7-d37b403e49dd-ovs-rundir\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.237942 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ba17e82c-3558-4262-99b7-d37b403e49dd-ovn-rundir\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.238036 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba17e82c-3558-4262-99b7-d37b403e49dd-config\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.238317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ba17e82c-3558-4262-99b7-d37b403e49dd-ovs-rundir\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: 
I0217 19:23:09.238342 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ba17e82c-3558-4262-99b7-d37b403e49dd-ovn-rundir\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.238762 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba17e82c-3558-4262-99b7-d37b403e49dd-config\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.256359 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzldc\" (UniqueName: \"kubernetes.io/projected/ba17e82c-3558-4262-99b7-d37b403e49dd-kube-api-access-vzldc\") pod \"ovn-controller-metrics-vjdzt\" (UID: \"ba17e82c-3558-4262-99b7-d37b403e49dd\") " pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.339963 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vjdzt" Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.837774 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vjdzt"] Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.872936 4892 generic.go:334] "Generic (PLEG): container finished" podID="8bec279d-5310-4762-8f15-b0ad2d919df9" containerID="9b8bfa6eda36946861129f8217f7662b0261c78a1c73b6c148a6969b886c3d85" exitCode=0 Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.873020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pc6zw" event={"ID":"8bec279d-5310-4762-8f15-b0ad2d919df9","Type":"ContainerDied","Data":"9b8bfa6eda36946861129f8217f7662b0261c78a1c73b6c148a6969b886c3d85"} Feb 17 19:23:09 crc kubenswrapper[4892]: I0217 19:23:09.874887 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vjdzt" event={"ID":"ba17e82c-3558-4262-99b7-d37b403e49dd","Type":"ContainerStarted","Data":"bd8cf1be6766c9af3ce7a26923deeee7833c214c9bd9190c7027958c98070186"} Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.885132 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vjdzt" event={"ID":"ba17e82c-3558-4262-99b7-d37b403e49dd","Type":"ContainerStarted","Data":"d42ea28329003fab4348191c7b538862edd05a418dd219c7c66d90b27f2c1a18"} Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.889795 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pc6zw" event={"ID":"8bec279d-5310-4762-8f15-b0ad2d919df9","Type":"ContainerStarted","Data":"84dc289936cb7f29e5d87bcd01eadc96d51b51a787719d3100a8c419521a0806"} Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.889849 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pc6zw" 
event={"ID":"8bec279d-5310-4762-8f15-b0ad2d919df9","Type":"ContainerStarted","Data":"379aab6358f88646bed30873e84ef29618ba55fe5c0bfb63462ea55701f4fc57"} Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.889988 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.890006 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.945132 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-pc6zw" podStartSLOduration=3.945111302 podStartE2EDuration="3.945111302s" podCreationTimestamp="2026-02-17 19:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:23:10.944107364 +0000 UTC m=+5962.319510639" watchObservedRunningTime="2026-02-17 19:23:10.945111302 +0000 UTC m=+5962.320514577" Feb 17 19:23:10 crc kubenswrapper[4892]: I0217 19:23:10.947016 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vjdzt" podStartSLOduration=2.947003843 podStartE2EDuration="2.947003843s" podCreationTimestamp="2026-02-17 19:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:23:10.900305434 +0000 UTC m=+5962.275708719" watchObservedRunningTime="2026-02-17 19:23:10.947003843 +0000 UTC m=+5962.322407118" Feb 17 19:23:14 crc kubenswrapper[4892]: I0217 19:23:14.047974 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jpgjq"] Feb 17 19:23:14 crc kubenswrapper[4892]: I0217 19:23:14.060652 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jpgjq"] Feb 17 19:23:14 crc kubenswrapper[4892]: 
I0217 19:23:14.359984 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:23:14 crc kubenswrapper[4892]: E0217 19:23:14.360338 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:23:15 crc kubenswrapper[4892]: I0217 19:23:15.380675 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7813c451-11c5-4d33-bfca-baa5b12a2235" path="/var/lib/kubelet/pods/7813c451-11c5-4d33-bfca-baa5b12a2235/volumes" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.755565 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-hpq92"] Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.757268 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.779397 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-hpq92"] Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.813100 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026c9245-5963-4614-b96a-a9d6c2a6b839-operator-scripts\") pod \"octavia-db-create-hpq92\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.813436 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ffr\" (UniqueName: \"kubernetes.io/projected/026c9245-5963-4614-b96a-a9d6c2a6b839-kube-api-access-q8ffr\") pod \"octavia-db-create-hpq92\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.915449 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ffr\" (UniqueName: \"kubernetes.io/projected/026c9245-5963-4614-b96a-a9d6c2a6b839-kube-api-access-q8ffr\") pod \"octavia-db-create-hpq92\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.915651 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026c9245-5963-4614-b96a-a9d6c2a6b839-operator-scripts\") pod \"octavia-db-create-hpq92\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.916417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/026c9245-5963-4614-b96a-a9d6c2a6b839-operator-scripts\") pod \"octavia-db-create-hpq92\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:16 crc kubenswrapper[4892]: I0217 19:23:16.934033 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ffr\" (UniqueName: \"kubernetes.io/projected/026c9245-5963-4614-b96a-a9d6c2a6b839-kube-api-access-q8ffr\") pod \"octavia-db-create-hpq92\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:17 crc kubenswrapper[4892]: I0217 19:23:17.082536 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:17 crc kubenswrapper[4892]: I0217 19:23:17.605908 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-hpq92"] Feb 17 19:23:17 crc kubenswrapper[4892]: I0217 19:23:17.985686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-hpq92" event={"ID":"026c9245-5963-4614-b96a-a9d6c2a6b839","Type":"ContainerStarted","Data":"08338a763062d1e3fe63c443c8caf6ebf457fa93055508058d82076f018e12a7"} Feb 17 19:23:17 crc kubenswrapper[4892]: I0217 19:23:17.985967 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-hpq92" event={"ID":"026c9245-5963-4614-b96a-a9d6c2a6b839","Type":"ContainerStarted","Data":"6a58cb730d620e59806e1f15527af84922976734f5c0ccce54951d20c1bb3623"} Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.006156 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-hpq92" podStartSLOduration=2.006124472 podStartE2EDuration="2.006124472s" podCreationTimestamp="2026-02-17 19:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:23:18.004210811 
+0000 UTC m=+5969.379614076" watchObservedRunningTime="2026-02-17 19:23:18.006124472 +0000 UTC m=+5969.381527737" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.166363 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-0986-account-create-update-r9xjq"] Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.167918 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.170209 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.178905 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-0986-account-create-update-r9xjq"] Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.244690 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b778e62-5003-4544-8653-d5a6c7b62648-operator-scripts\") pod \"octavia-0986-account-create-update-r9xjq\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.244778 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn4rq\" (UniqueName: \"kubernetes.io/projected/5b778e62-5003-4544-8653-d5a6c7b62648-kube-api-access-rn4rq\") pod \"octavia-0986-account-create-update-r9xjq\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.346969 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b778e62-5003-4544-8653-d5a6c7b62648-operator-scripts\") pod 
\"octavia-0986-account-create-update-r9xjq\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.347333 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn4rq\" (UniqueName: \"kubernetes.io/projected/5b778e62-5003-4544-8653-d5a6c7b62648-kube-api-access-rn4rq\") pod \"octavia-0986-account-create-update-r9xjq\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.348219 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b778e62-5003-4544-8653-d5a6c7b62648-operator-scripts\") pod \"octavia-0986-account-create-update-r9xjq\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.378277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn4rq\" (UniqueName: \"kubernetes.io/projected/5b778e62-5003-4544-8653-d5a6c7b62648-kube-api-access-rn4rq\") pod \"octavia-0986-account-create-update-r9xjq\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:18 crc kubenswrapper[4892]: I0217 19:23:18.538157 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:19 crc kubenswrapper[4892]: I0217 19:23:19.001978 4892 generic.go:334] "Generic (PLEG): container finished" podID="026c9245-5963-4614-b96a-a9d6c2a6b839" containerID="08338a763062d1e3fe63c443c8caf6ebf457fa93055508058d82076f018e12a7" exitCode=0 Feb 17 19:23:19 crc kubenswrapper[4892]: I0217 19:23:19.002110 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-hpq92" event={"ID":"026c9245-5963-4614-b96a-a9d6c2a6b839","Type":"ContainerDied","Data":"08338a763062d1e3fe63c443c8caf6ebf457fa93055508058d82076f018e12a7"} Feb 17 19:23:19 crc kubenswrapper[4892]: I0217 19:23:19.069230 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-0986-account-create-update-r9xjq"] Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.026262 4892 generic.go:334] "Generic (PLEG): container finished" podID="5b778e62-5003-4544-8653-d5a6c7b62648" containerID="e4a28dbe11c4731fbaf8dc60346b4b2da6f455dd469edc28be32354c857797f1" exitCode=0 Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.026343 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-0986-account-create-update-r9xjq" event={"ID":"5b778e62-5003-4544-8653-d5a6c7b62648","Type":"ContainerDied","Data":"e4a28dbe11c4731fbaf8dc60346b4b2da6f455dd469edc28be32354c857797f1"} Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.026553 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-0986-account-create-update-r9xjq" event={"ID":"5b778e62-5003-4544-8653-d5a6c7b62648","Type":"ContainerStarted","Data":"7c559274a6a5a243280c9061c04816dbfbef22c420cc34c4d9f548b52430c737"} Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.419220 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.489995 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8ffr\" (UniqueName: \"kubernetes.io/projected/026c9245-5963-4614-b96a-a9d6c2a6b839-kube-api-access-q8ffr\") pod \"026c9245-5963-4614-b96a-a9d6c2a6b839\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.490053 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026c9245-5963-4614-b96a-a9d6c2a6b839-operator-scripts\") pod \"026c9245-5963-4614-b96a-a9d6c2a6b839\" (UID: \"026c9245-5963-4614-b96a-a9d6c2a6b839\") " Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.490803 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026c9245-5963-4614-b96a-a9d6c2a6b839-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "026c9245-5963-4614-b96a-a9d6c2a6b839" (UID: "026c9245-5963-4614-b96a-a9d6c2a6b839"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.500145 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026c9245-5963-4614-b96a-a9d6c2a6b839-kube-api-access-q8ffr" (OuterVolumeSpecName: "kube-api-access-q8ffr") pod "026c9245-5963-4614-b96a-a9d6c2a6b839" (UID: "026c9245-5963-4614-b96a-a9d6c2a6b839"). InnerVolumeSpecName "kube-api-access-q8ffr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.593229 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8ffr\" (UniqueName: \"kubernetes.io/projected/026c9245-5963-4614-b96a-a9d6c2a6b839-kube-api-access-q8ffr\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:20 crc kubenswrapper[4892]: I0217 19:23:20.593280 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026c9245-5963-4614-b96a-a9d6c2a6b839-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.070324 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-hpq92" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.070496 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-hpq92" event={"ID":"026c9245-5963-4614-b96a-a9d6c2a6b839","Type":"ContainerDied","Data":"6a58cb730d620e59806e1f15527af84922976734f5c0ccce54951d20c1bb3623"} Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.070728 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a58cb730d620e59806e1f15527af84922976734f5c0ccce54951d20c1bb3623" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.525111 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.626586 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn4rq\" (UniqueName: \"kubernetes.io/projected/5b778e62-5003-4544-8653-d5a6c7b62648-kube-api-access-rn4rq\") pod \"5b778e62-5003-4544-8653-d5a6c7b62648\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.626706 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b778e62-5003-4544-8653-d5a6c7b62648-operator-scripts\") pod \"5b778e62-5003-4544-8653-d5a6c7b62648\" (UID: \"5b778e62-5003-4544-8653-d5a6c7b62648\") " Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.627301 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b778e62-5003-4544-8653-d5a6c7b62648-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b778e62-5003-4544-8653-d5a6c7b62648" (UID: "5b778e62-5003-4544-8653-d5a6c7b62648"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.627632 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b778e62-5003-4544-8653-d5a6c7b62648-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.632078 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b778e62-5003-4544-8653-d5a6c7b62648-kube-api-access-rn4rq" (OuterVolumeSpecName: "kube-api-access-rn4rq") pod "5b778e62-5003-4544-8653-d5a6c7b62648" (UID: "5b778e62-5003-4544-8653-d5a6c7b62648"). InnerVolumeSpecName "kube-api-access-rn4rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:23:21 crc kubenswrapper[4892]: I0217 19:23:21.728636 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn4rq\" (UniqueName: \"kubernetes.io/projected/5b778e62-5003-4544-8653-d5a6c7b62648-kube-api-access-rn4rq\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:22 crc kubenswrapper[4892]: I0217 19:23:22.086728 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-0986-account-create-update-r9xjq" event={"ID":"5b778e62-5003-4544-8653-d5a6c7b62648","Type":"ContainerDied","Data":"7c559274a6a5a243280c9061c04816dbfbef22c420cc34c4d9f548b52430c737"} Feb 17 19:23:22 crc kubenswrapper[4892]: I0217 19:23:22.087101 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c559274a6a5a243280c9061c04816dbfbef22c420cc34c4d9f548b52430c737" Feb 17 19:23:22 crc kubenswrapper[4892]: I0217 19:23:22.086807 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-0986-account-create-update-r9xjq" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.025480 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-kn955"] Feb 17 19:23:24 crc kubenswrapper[4892]: E0217 19:23:24.026386 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b778e62-5003-4544-8653-d5a6c7b62648" containerName="mariadb-account-create-update" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.026406 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b778e62-5003-4544-8653-d5a6c7b62648" containerName="mariadb-account-create-update" Feb 17 19:23:24 crc kubenswrapper[4892]: E0217 19:23:24.026441 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026c9245-5963-4614-b96a-a9d6c2a6b839" containerName="mariadb-database-create" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.026449 4892 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="026c9245-5963-4614-b96a-a9d6c2a6b839" containerName="mariadb-database-create" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.026740 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b778e62-5003-4544-8653-d5a6c7b62648" containerName="mariadb-account-create-update" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.026766 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="026c9245-5963-4614-b96a-a9d6c2a6b839" containerName="mariadb-database-create" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.027619 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.038096 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kn955"] Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.184567 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wbd\" (UniqueName: \"kubernetes.io/projected/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-kube-api-access-m7wbd\") pod \"octavia-persistence-db-create-kn955\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.185419 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-operator-scripts\") pod \"octavia-persistence-db-create-kn955\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.288205 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wbd\" (UniqueName: 
\"kubernetes.io/projected/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-kube-api-access-m7wbd\") pod \"octavia-persistence-db-create-kn955\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.288316 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-operator-scripts\") pod \"octavia-persistence-db-create-kn955\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.289211 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-operator-scripts\") pod \"octavia-persistence-db-create-kn955\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.311258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wbd\" (UniqueName: \"kubernetes.io/projected/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-kube-api-access-m7wbd\") pod \"octavia-persistence-db-create-kn955\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.361652 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:24 crc kubenswrapper[4892]: W0217 19:23:24.893619 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f2653d2_fc1e_4bcb_ad8c_0c372dfb0f4a.slice/crio-9e08a74ad51aee650e6377b6b1840c7d2ffa3f92671ec06c688c4681a2787356 WatchSource:0}: Error finding container 9e08a74ad51aee650e6377b6b1840c7d2ffa3f92671ec06c688c4681a2787356: Status 404 returned error can't find the container with id 9e08a74ad51aee650e6377b6b1840c7d2ffa3f92671ec06c688c4681a2787356 Feb 17 19:23:24 crc kubenswrapper[4892]: I0217 19:23:24.897688 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kn955"] Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.121556 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kn955" event={"ID":"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a","Type":"ContainerStarted","Data":"9e08a74ad51aee650e6377b6b1840c7d2ffa3f92671ec06c688c4681a2787356"} Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.173250 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-096a-account-create-update-tnsbx"] Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.175076 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.176910 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.190209 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-096a-account-create-update-tnsbx"] Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.310233 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be7a9e1-cad7-4881-b176-c238e08665a9-operator-scripts\") pod \"octavia-096a-account-create-update-tnsbx\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.310743 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b96fw\" (UniqueName: \"kubernetes.io/projected/6be7a9e1-cad7-4881-b176-c238e08665a9-kube-api-access-b96fw\") pod \"octavia-096a-account-create-update-tnsbx\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.413288 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be7a9e1-cad7-4881-b176-c238e08665a9-operator-scripts\") pod \"octavia-096a-account-create-update-tnsbx\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.413431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b96fw\" (UniqueName: 
\"kubernetes.io/projected/6be7a9e1-cad7-4881-b176-c238e08665a9-kube-api-access-b96fw\") pod \"octavia-096a-account-create-update-tnsbx\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.414230 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be7a9e1-cad7-4881-b176-c238e08665a9-operator-scripts\") pod \"octavia-096a-account-create-update-tnsbx\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.444610 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b96fw\" (UniqueName: \"kubernetes.io/projected/6be7a9e1-cad7-4881-b176-c238e08665a9-kube-api-access-b96fw\") pod \"octavia-096a-account-create-update-tnsbx\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:25 crc kubenswrapper[4892]: I0217 19:23:25.499477 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:26 crc kubenswrapper[4892]: I0217 19:23:26.013130 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-096a-account-create-update-tnsbx"] Feb 17 19:23:26 crc kubenswrapper[4892]: W0217 19:23:26.016960 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6be7a9e1_cad7_4881_b176_c238e08665a9.slice/crio-31622c657a5a2b32b21dd91f9449ba85d8245867bd315d6f0b7f9032199040ee WatchSource:0}: Error finding container 31622c657a5a2b32b21dd91f9449ba85d8245867bd315d6f0b7f9032199040ee: Status 404 returned error can't find the container with id 31622c657a5a2b32b21dd91f9449ba85d8245867bd315d6f0b7f9032199040ee Feb 17 19:23:26 crc kubenswrapper[4892]: I0217 19:23:26.133752 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-096a-account-create-update-tnsbx" event={"ID":"6be7a9e1-cad7-4881-b176-c238e08665a9","Type":"ContainerStarted","Data":"31622c657a5a2b32b21dd91f9449ba85d8245867bd315d6f0b7f9032199040ee"} Feb 17 19:23:26 crc kubenswrapper[4892]: I0217 19:23:26.137674 4892 generic.go:334] "Generic (PLEG): container finished" podID="6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" containerID="e8c54432f1b557347791329d315a7383365a73a9f9b3d488d6c7b3cc0ce2930d" exitCode=0 Feb 17 19:23:26 crc kubenswrapper[4892]: I0217 19:23:26.137708 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kn955" event={"ID":"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a","Type":"ContainerDied","Data":"e8c54432f1b557347791329d315a7383365a73a9f9b3d488d6c7b3cc0ce2930d"} Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.152088 4892 generic.go:334] "Generic (PLEG): container finished" podID="6be7a9e1-cad7-4881-b176-c238e08665a9" containerID="45c16671d07799a9d496923becc2da51857aca42d04c2e7751709c015ae8c017" exitCode=0 Feb 17 19:23:27 crc kubenswrapper[4892]: 
I0217 19:23:27.152657 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-096a-account-create-update-tnsbx" event={"ID":"6be7a9e1-cad7-4881-b176-c238e08665a9","Type":"ContainerDied","Data":"45c16671d07799a9d496923becc2da51857aca42d04c2e7751709c015ae8c017"} Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.568788 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.763349 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7wbd\" (UniqueName: \"kubernetes.io/projected/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-kube-api-access-m7wbd\") pod \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.763526 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-operator-scripts\") pod \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\" (UID: \"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a\") " Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.764284 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" (UID: "6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.769151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-kube-api-access-m7wbd" (OuterVolumeSpecName: "kube-api-access-m7wbd") pod "6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" (UID: "6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a"). InnerVolumeSpecName "kube-api-access-m7wbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.866713 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7wbd\" (UniqueName: \"kubernetes.io/projected/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-kube-api-access-m7wbd\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:27 crc kubenswrapper[4892]: I0217 19:23:27.866756 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.170249 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kn955" event={"ID":"6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a","Type":"ContainerDied","Data":"9e08a74ad51aee650e6377b6b1840c7d2ffa3f92671ec06c688c4681a2787356"} Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.170340 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e08a74ad51aee650e6377b6b1840c7d2ffa3f92671ec06c688c4681a2787356" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.170343 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-kn955" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.361217 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:23:28 crc kubenswrapper[4892]: E0217 19:23:28.362102 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.653995 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.802538 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b96fw\" (UniqueName: \"kubernetes.io/projected/6be7a9e1-cad7-4881-b176-c238e08665a9-kube-api-access-b96fw\") pod \"6be7a9e1-cad7-4881-b176-c238e08665a9\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.802939 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be7a9e1-cad7-4881-b176-c238e08665a9-operator-scripts\") pod \"6be7a9e1-cad7-4881-b176-c238e08665a9\" (UID: \"6be7a9e1-cad7-4881-b176-c238e08665a9\") " Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.803797 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be7a9e1-cad7-4881-b176-c238e08665a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6be7a9e1-cad7-4881-b176-c238e08665a9" (UID: 
"6be7a9e1-cad7-4881-b176-c238e08665a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.806528 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be7a9e1-cad7-4881-b176-c238e08665a9-kube-api-access-b96fw" (OuterVolumeSpecName: "kube-api-access-b96fw") pod "6be7a9e1-cad7-4881-b176-c238e08665a9" (UID: "6be7a9e1-cad7-4881-b176-c238e08665a9"). InnerVolumeSpecName "kube-api-access-b96fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.906489 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be7a9e1-cad7-4881-b176-c238e08665a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:28 crc kubenswrapper[4892]: I0217 19:23:28.906986 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b96fw\" (UniqueName: \"kubernetes.io/projected/6be7a9e1-cad7-4881-b176-c238e08665a9-kube-api-access-b96fw\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:29 crc kubenswrapper[4892]: I0217 19:23:29.178738 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-096a-account-create-update-tnsbx" event={"ID":"6be7a9e1-cad7-4881-b176-c238e08665a9","Type":"ContainerDied","Data":"31622c657a5a2b32b21dd91f9449ba85d8245867bd315d6f0b7f9032199040ee"} Feb 17 19:23:29 crc kubenswrapper[4892]: I0217 19:23:29.178776 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31622c657a5a2b32b21dd91f9449ba85d8245867bd315d6f0b7f9032199040ee" Feb 17 19:23:29 crc kubenswrapper[4892]: I0217 19:23:29.178808 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-096a-account-create-update-tnsbx" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.163878 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-bd9b9bf5f-c7bmv"] Feb 17 19:23:31 crc kubenswrapper[4892]: E0217 19:23:31.164858 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" containerName="mariadb-database-create" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.164880 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" containerName="mariadb-database-create" Feb 17 19:23:31 crc kubenswrapper[4892]: E0217 19:23:31.164904 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be7a9e1-cad7-4881-b176-c238e08665a9" containerName="mariadb-account-create-update" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.164912 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be7a9e1-cad7-4881-b176-c238e08665a9" containerName="mariadb-account-create-update" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.165169 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" containerName="mariadb-database-create" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.165197 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be7a9e1-cad7-4881-b176-c238e08665a9" containerName="mariadb-account-create-update" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.167267 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.170673 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.170850 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.170983 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-wc42x" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.175352 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-bd9b9bf5f-c7bmv"] Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.363180 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-config-data-merged\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.363259 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-combined-ca-bundle\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.363335 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-scripts\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: 
I0217 19:23:31.363386 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-config-data\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.363759 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-octavia-run\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.465587 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-config-data-merged\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.465648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-combined-ca-bundle\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.465722 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-scripts\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.465785 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-config-data\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.465941 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-octavia-run\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.466368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-config-data-merged\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.466559 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-octavia-run\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.472478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-config-data\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.473131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-combined-ca-bundle\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.480327 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e8dd5b6-76f7-4712-a660-b7a23721b2ad-scripts\") pod \"octavia-api-bd9b9bf5f-c7bmv\" (UID: \"6e8dd5b6-76f7-4712-a660-b7a23721b2ad\") " pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:31 crc kubenswrapper[4892]: I0217 19:23:31.488915 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:32 crc kubenswrapper[4892]: I0217 19:23:32.087053 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-bd9b9bf5f-c7bmv"] Feb 17 19:23:32 crc kubenswrapper[4892]: I0217 19:23:32.224941 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" event={"ID":"6e8dd5b6-76f7-4712-a660-b7a23721b2ad","Type":"ContainerStarted","Data":"1f41789828f68490de9886afab5ce0e3a3f1f3f9e089f54b9385c9c3f15624a3"} Feb 17 19:23:41 crc kubenswrapper[4892]: I0217 19:23:41.347873 4892 generic.go:334] "Generic (PLEG): container finished" podID="6e8dd5b6-76f7-4712-a660-b7a23721b2ad" containerID="7ab537a5fec6ac67ac34c94ceae3e62d5c63818c065aa7a1ce5d7851d456e2d8" exitCode=0 Feb 17 19:23:41 crc kubenswrapper[4892]: I0217 19:23:41.347967 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" event={"ID":"6e8dd5b6-76f7-4712-a660-b7a23721b2ad","Type":"ContainerDied","Data":"7ab537a5fec6ac67ac34c94ceae3e62d5c63818c065aa7a1ce5d7851d456e2d8"} Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.359807 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" 
event={"ID":"6e8dd5b6-76f7-4712-a660-b7a23721b2ad","Type":"ContainerStarted","Data":"100498b61d882a958cf73ebd8bad86acbb60c6429c982413759e8de9c37715cc"} Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.360072 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.360084 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.360091 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" event={"ID":"6e8dd5b6-76f7-4712-a660-b7a23721b2ad","Type":"ContainerStarted","Data":"6963980b0228d0eca4887e4b94960cc54be8a744772b888a1435540550f9d0c0"} Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.398346 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" podStartSLOduration=3.223399772 podStartE2EDuration="11.398329999s" podCreationTimestamp="2026-02-17 19:23:31 +0000 UTC" firstStartedPulling="2026-02-17 19:23:32.086383746 +0000 UTC m=+5983.461787011" lastFinishedPulling="2026-02-17 19:23:40.261313963 +0000 UTC m=+5991.636717238" observedRunningTime="2026-02-17 19:23:42.3920885 +0000 UTC m=+5993.767491765" watchObservedRunningTime="2026-02-17 19:23:42.398329999 +0000 UTC m=+5993.773733264" Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.878335 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9w7kn" podUID="031573a9-543b-4333-a50c-d7e514eb3a41" containerName="ovn-controller" probeResult="failure" output=< Feb 17 19:23:42 crc kubenswrapper[4892]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 19:23:42 crc kubenswrapper[4892]: > Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.964572 4892 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:42 crc kubenswrapper[4892]: I0217 19:23:42.966624 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pc6zw" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.098997 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9w7kn-config-8dnwr"] Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.100304 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.102094 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.119395 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9w7kn-config-8dnwr"] Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.224746 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.224951 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run-ovn\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.225068 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-additional-scripts\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.225134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gmm\" (UniqueName: \"kubernetes.io/projected/78e52789-0598-48ee-a2b3-0f87657e15aa-kube-api-access-x7gmm\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.225219 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-scripts\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.225429 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-log-ovn\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.327007 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-log-ovn\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.327096 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.327143 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run-ovn\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.327182 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-additional-scripts\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.327227 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gmm\" (UniqueName: \"kubernetes.io/projected/78e52789-0598-48ee-a2b3-0f87657e15aa-kube-api-access-x7gmm\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.327262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-scripts\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.329137 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-scripts\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.329493 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.329640 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run-ovn\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.329995 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-additional-scripts\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.330060 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-log-ovn\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.347201 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gmm\" (UniqueName: 
\"kubernetes.io/projected/78e52789-0598-48ee-a2b3-0f87657e15aa-kube-api-access-x7gmm\") pod \"ovn-controller-9w7kn-config-8dnwr\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.361518 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:23:43 crc kubenswrapper[4892]: E0217 19:23:43.362328 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.421490 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:43 crc kubenswrapper[4892]: I0217 19:23:43.948800 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9w7kn-config-8dnwr"] Feb 17 19:23:44 crc kubenswrapper[4892]: I0217 19:23:44.381785 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-8dnwr" event={"ID":"78e52789-0598-48ee-a2b3-0f87657e15aa","Type":"ContainerStarted","Data":"9209cffe36c192d2ded24a8b285eaaf05912ad95d3061f67a9efab49d8beb351"} Feb 17 19:23:44 crc kubenswrapper[4892]: I0217 19:23:44.382100 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-8dnwr" event={"ID":"78e52789-0598-48ee-a2b3-0f87657e15aa","Type":"ContainerStarted","Data":"5f035b58bd82ed1ab3d83e110bbaa12b1bf7dec99b450c2156a5ff6e57a4cbd4"} Feb 17 19:23:44 crc kubenswrapper[4892]: I0217 19:23:44.407966 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9w7kn-config-8dnwr" podStartSLOduration=1.407945328 podStartE2EDuration="1.407945328s" podCreationTimestamp="2026-02-17 19:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:23:44.39837966 +0000 UTC m=+5995.773782915" watchObservedRunningTime="2026-02-17 19:23:44.407945328 +0000 UTC m=+5995.783348603" Feb 17 19:23:45 crc kubenswrapper[4892]: I0217 19:23:45.393584 4892 generic.go:334] "Generic (PLEG): container finished" podID="78e52789-0598-48ee-a2b3-0f87657e15aa" containerID="9209cffe36c192d2ded24a8b285eaaf05912ad95d3061f67a9efab49d8beb351" exitCode=0 Feb 17 19:23:45 crc kubenswrapper[4892]: I0217 19:23:45.393663 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-8dnwr" 
event={"ID":"78e52789-0598-48ee-a2b3-0f87657e15aa","Type":"ContainerDied","Data":"9209cffe36c192d2ded24a8b285eaaf05912ad95d3061f67a9efab49d8beb351"} Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.829085 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899147 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-additional-scripts\") pod \"78e52789-0598-48ee-a2b3-0f87657e15aa\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899195 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run\") pod \"78e52789-0598-48ee-a2b3-0f87657e15aa\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899225 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-log-ovn\") pod \"78e52789-0598-48ee-a2b3-0f87657e15aa\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899337 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-scripts\") pod \"78e52789-0598-48ee-a2b3-0f87657e15aa\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899394 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run-ovn\") pod 
\"78e52789-0598-48ee-a2b3-0f87657e15aa\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899486 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7gmm\" (UniqueName: \"kubernetes.io/projected/78e52789-0598-48ee-a2b3-0f87657e15aa-kube-api-access-x7gmm\") pod \"78e52789-0598-48ee-a2b3-0f87657e15aa\" (UID: \"78e52789-0598-48ee-a2b3-0f87657e15aa\") " Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899914 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run" (OuterVolumeSpecName: "var-run") pod "78e52789-0598-48ee-a2b3-0f87657e15aa" (UID: "78e52789-0598-48ee-a2b3-0f87657e15aa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.899914 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "78e52789-0598-48ee-a2b3-0f87657e15aa" (UID: "78e52789-0598-48ee-a2b3-0f87657e15aa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.900002 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "78e52789-0598-48ee-a2b3-0f87657e15aa" (UID: "78e52789-0598-48ee-a2b3-0f87657e15aa"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.900071 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "78e52789-0598-48ee-a2b3-0f87657e15aa" (UID: "78e52789-0598-48ee-a2b3-0f87657e15aa"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.900608 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-scripts" (OuterVolumeSpecName: "scripts") pod "78e52789-0598-48ee-a2b3-0f87657e15aa" (UID: "78e52789-0598-48ee-a2b3-0f87657e15aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:46 crc kubenswrapper[4892]: I0217 19:23:46.908138 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e52789-0598-48ee-a2b3-0f87657e15aa-kube-api-access-x7gmm" (OuterVolumeSpecName: "kube-api-access-x7gmm") pod "78e52789-0598-48ee-a2b3-0f87657e15aa" (UID: "78e52789-0598-48ee-a2b3-0f87657e15aa"). InnerVolumeSpecName "kube-api-access-x7gmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.001775 4892 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.001815 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7gmm\" (UniqueName: \"kubernetes.io/projected/78e52789-0598-48ee-a2b3-0f87657e15aa-kube-api-access-x7gmm\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.001831 4892 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.001860 4892 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.001873 4892 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e52789-0598-48ee-a2b3-0f87657e15aa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.001889 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e52789-0598-48ee-a2b3-0f87657e15aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.427191 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-8dnwr" event={"ID":"78e52789-0598-48ee-a2b3-0f87657e15aa","Type":"ContainerDied","Data":"5f035b58bd82ed1ab3d83e110bbaa12b1bf7dec99b450c2156a5ff6e57a4cbd4"} Feb 17 19:23:47 crc 
kubenswrapper[4892]: I0217 19:23:47.427230 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f035b58bd82ed1ab3d83e110bbaa12b1bf7dec99b450c2156a5ff6e57a4cbd4" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.427281 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-8dnwr" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.494425 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9w7kn-config-8dnwr"] Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.503649 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9w7kn-config-8dnwr"] Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.613749 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9w7kn-config-tq5m5"] Feb 17 19:23:47 crc kubenswrapper[4892]: E0217 19:23:47.614254 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e52789-0598-48ee-a2b3-0f87657e15aa" containerName="ovn-config" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.614272 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e52789-0598-48ee-a2b3-0f87657e15aa" containerName="ovn-config" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.614486 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e52789-0598-48ee-a2b3-0f87657e15aa" containerName="ovn-config" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.615160 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.617010 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.644378 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9w7kn-config-tq5m5"] Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.715950 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78zl8\" (UniqueName: \"kubernetes.io/projected/a447e192-b0c8-4dd9-be35-be8ccd7fb281-kube-api-access-78zl8\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.716109 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run-ovn\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.716130 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.716359 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-additional-scripts\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: 
\"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.716447 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-log-ovn\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.716595 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-scripts\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.825543 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run-ovn\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.825612 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.825712 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-additional-scripts\") pod \"ovn-controller-9w7kn-config-tq5m5\" 
(UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.825771 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-log-ovn\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.825905 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-scripts\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.826008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78zl8\" (UniqueName: \"kubernetes.io/projected/a447e192-b0c8-4dd9-be35-be8ccd7fb281-kube-api-access-78zl8\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.826890 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run-ovn\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.826973 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: 
\"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.827745 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-additional-scripts\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.827822 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-log-ovn\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.830417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-scripts\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.857617 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78zl8\" (UniqueName: \"kubernetes.io/projected/a447e192-b0c8-4dd9-be35-be8ccd7fb281-kube-api-access-78zl8\") pod \"ovn-controller-9w7kn-config-tq5m5\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.896461 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9w7kn" Feb 17 19:23:47 crc kubenswrapper[4892]: I0217 19:23:47.933073 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:48 crc kubenswrapper[4892]: I0217 19:23:48.474673 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9w7kn-config-tq5m5"] Feb 17 19:23:48 crc kubenswrapper[4892]: W0217 19:23:48.476545 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda447e192_b0c8_4dd9_be35_be8ccd7fb281.slice/crio-fd342454f345622d059b6d72c986dab23625782c568ae917614fc7ebeeb79051 WatchSource:0}: Error finding container fd342454f345622d059b6d72c986dab23625782c568ae917614fc7ebeeb79051: Status 404 returned error can't find the container with id fd342454f345622d059b6d72c986dab23625782c568ae917614fc7ebeeb79051 Feb 17 19:23:49 crc kubenswrapper[4892]: I0217 19:23:49.377675 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e52789-0598-48ee-a2b3-0f87657e15aa" path="/var/lib/kubelet/pods/78e52789-0598-48ee-a2b3-0f87657e15aa/volumes" Feb 17 19:23:49 crc kubenswrapper[4892]: I0217 19:23:49.476781 4892 generic.go:334] "Generic (PLEG): container finished" podID="a447e192-b0c8-4dd9-be35-be8ccd7fb281" containerID="1c3ae0c61f075ab319a23e6c94701988aa7d870316d86e0d474f3faa28ccaaf9" exitCode=0 Feb 17 19:23:49 crc kubenswrapper[4892]: I0217 19:23:49.476858 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-tq5m5" event={"ID":"a447e192-b0c8-4dd9-be35-be8ccd7fb281","Type":"ContainerDied","Data":"1c3ae0c61f075ab319a23e6c94701988aa7d870316d86e0d474f3faa28ccaaf9"} Feb 17 19:23:49 crc kubenswrapper[4892]: I0217 19:23:49.476903 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-tq5m5" event={"ID":"a447e192-b0c8-4dd9-be35-be8ccd7fb281","Type":"ContainerStarted","Data":"fd342454f345622d059b6d72c986dab23625782c568ae917614fc7ebeeb79051"} Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.401860 4892 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-c6t7x"] Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.418699 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.420166 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c6t7x"] Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.423271 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.426660 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.429313 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.488404 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-config-data\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.488459 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-scripts\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.488992 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-config-data-merged\") pod 
\"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.489961 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-hm-ports\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.592906 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-config-data-merged\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.593245 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-hm-ports\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.593320 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-config-data\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.593338 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-scripts\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc 
kubenswrapper[4892]: I0217 19:23:50.593471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-config-data-merged\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.594468 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-hm-ports\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.599479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-config-data\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.616912 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd17ca-4457-418b-8dc1-3f8b5e485d72-scripts\") pod \"octavia-rsyslog-c6t7x\" (UID: \"2fcd17ca-4457-418b-8dc1-3f8b5e485d72\") " pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.700412 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.748192 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.830390 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-bd9b9bf5f-c7bmv" Feb 17 19:23:50 crc kubenswrapper[4892]: I0217 19:23:50.934367 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004057 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-scripts\") pod \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004197 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78zl8\" (UniqueName: \"kubernetes.io/projected/a447e192-b0c8-4dd9-be35-be8ccd7fb281-kube-api-access-78zl8\") pod \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004228 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-log-ovn\") pod \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004283 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run-ovn\") pod \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004354 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run\") pod \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a447e192-b0c8-4dd9-be35-be8ccd7fb281" (UID: "a447e192-b0c8-4dd9-be35-be8ccd7fb281"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004475 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-additional-scripts\") pod \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\" (UID: \"a447e192-b0c8-4dd9-be35-be8ccd7fb281\") " Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004432 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a447e192-b0c8-4dd9-be35-be8ccd7fb281" (UID: "a447e192-b0c8-4dd9-be35-be8ccd7fb281"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.004499 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run" (OuterVolumeSpecName: "var-run") pod "a447e192-b0c8-4dd9-be35-be8ccd7fb281" (UID: "a447e192-b0c8-4dd9-be35-be8ccd7fb281"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.005103 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a447e192-b0c8-4dd9-be35-be8ccd7fb281" (UID: "a447e192-b0c8-4dd9-be35-be8ccd7fb281"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.005212 4892 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.005233 4892 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.005245 4892 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a447e192-b0c8-4dd9-be35-be8ccd7fb281-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.005257 4892 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.005271 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-scripts" (OuterVolumeSpecName: "scripts") pod "a447e192-b0c8-4dd9-be35-be8ccd7fb281" (UID: "a447e192-b0c8-4dd9-be35-be8ccd7fb281"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.011240 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a447e192-b0c8-4dd9-be35-be8ccd7fb281-kube-api-access-78zl8" (OuterVolumeSpecName: "kube-api-access-78zl8") pod "a447e192-b0c8-4dd9-be35-be8ccd7fb281" (UID: "a447e192-b0c8-4dd9-be35-be8ccd7fb281"). InnerVolumeSpecName "kube-api-access-78zl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.106877 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a447e192-b0c8-4dd9-be35-be8ccd7fb281-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.107182 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78zl8\" (UniqueName: \"kubernetes.io/projected/a447e192-b0c8-4dd9-be35-be8ccd7fb281-kube-api-access-78zl8\") on node \"crc\" DevicePath \"\"" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.125565 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-p4jsp"] Feb 17 19:23:51 crc kubenswrapper[4892]: E0217 19:23:51.126134 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a447e192-b0c8-4dd9-be35-be8ccd7fb281" containerName="ovn-config" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.126156 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a447e192-b0c8-4dd9-be35-be8ccd7fb281" containerName="ovn-config" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.126685 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a447e192-b0c8-4dd9-be35-be8ccd7fb281" containerName="ovn-config" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.129861 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.132678 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.163547 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-p4jsp"] Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.210207 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-httpd-config\") pod \"octavia-image-upload-59f8cff499-p4jsp\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.210379 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-amphora-image\") pod \"octavia-image-upload-59f8cff499-p4jsp\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.313208 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-amphora-image\") pod \"octavia-image-upload-59f8cff499-p4jsp\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.313343 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-httpd-config\") pod \"octavia-image-upload-59f8cff499-p4jsp\" (UID: 
\"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.313720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-amphora-image\") pod \"octavia-image-upload-59f8cff499-p4jsp\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.325383 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-httpd-config\") pod \"octavia-image-upload-59f8cff499-p4jsp\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.388912 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c6t7x"] Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.449627 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c6t7x"] Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.467264 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.496693 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9w7kn-config-tq5m5" event={"ID":"a447e192-b0c8-4dd9-be35-be8ccd7fb281","Type":"ContainerDied","Data":"fd342454f345622d059b6d72c986dab23625782c568ae917614fc7ebeeb79051"} Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.496729 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd342454f345622d059b6d72c986dab23625782c568ae917614fc7ebeeb79051" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.496735 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9w7kn-config-tq5m5" Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.497930 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c6t7x" event={"ID":"2fcd17ca-4457-418b-8dc1-3f8b5e485d72","Type":"ContainerStarted","Data":"449bc078898dcc04ff1159a2252e9041adfc03a1358f71a3930910f1640774d6"} Feb 17 19:23:51 crc kubenswrapper[4892]: I0217 19:23:51.912404 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-p4jsp"] Feb 17 19:23:52 crc kubenswrapper[4892]: I0217 19:23:52.034177 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9w7kn-config-tq5m5"] Feb 17 19:23:52 crc kubenswrapper[4892]: I0217 19:23:52.047540 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9w7kn-config-tq5m5"] Feb 17 19:23:52 crc kubenswrapper[4892]: I0217 19:23:52.535445 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" event={"ID":"c5c321c3-f90f-4f24-b7ed-2027e64b6e01","Type":"ContainerStarted","Data":"36fad1b3c74e4563f20d05809e818b69a40ba170918ee856787d6521123bf857"} Feb 17 19:23:53 crc kubenswrapper[4892]: I0217 
19:23:53.371605 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a447e192-b0c8-4dd9-be35-be8ccd7fb281" path="/var/lib/kubelet/pods/a447e192-b0c8-4dd9-be35-be8ccd7fb281/volumes" Feb 17 19:23:53 crc kubenswrapper[4892]: I0217 19:23:53.550061 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c6t7x" event={"ID":"2fcd17ca-4457-418b-8dc1-3f8b5e485d72","Type":"ContainerStarted","Data":"7b32c057d4d02180b2493ebbee1a10f2ac1bff06e88b706a98a83e6cf88b2552"} Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.361117 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:23:55 crc kubenswrapper[4892]: E0217 19:23:55.361770 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.570206 4892 generic.go:334] "Generic (PLEG): container finished" podID="2fcd17ca-4457-418b-8dc1-3f8b5e485d72" containerID="7b32c057d4d02180b2493ebbee1a10f2ac1bff06e88b706a98a83e6cf88b2552" exitCode=0 Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.570452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c6t7x" event={"ID":"2fcd17ca-4457-418b-8dc1-3f8b5e485d72","Type":"ContainerDied","Data":"7b32c057d4d02180b2493ebbee1a10f2ac1bff06e88b706a98a83e6cf88b2552"} Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.871914 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-bk2pv"] Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.874126 4892 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.882449 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.882499 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.882907 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.915774 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-bk2pv"] Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.948886 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-scripts\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.948931 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-config-data\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.948999 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-amphora-certs\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 
19:23:55.949073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c603be1-c02e-4c29-ae4c-f0209436fa4b-hm-ports\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.949132 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c603be1-c02e-4c29-ae4c-f0209436fa4b-config-data-merged\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:55 crc kubenswrapper[4892]: I0217 19:23:55.949151 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-combined-ca-bundle\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.051393 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-scripts\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.051438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-config-data\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.051516 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-amphora-certs\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.051548 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c603be1-c02e-4c29-ae4c-f0209436fa4b-hm-ports\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.051592 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c603be1-c02e-4c29-ae4c-f0209436fa4b-config-data-merged\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.051611 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-combined-ca-bundle\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.052454 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c603be1-c02e-4c29-ae4c-f0209436fa4b-config-data-merged\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.052789 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c603be1-c02e-4c29-ae4c-f0209436fa4b-hm-ports\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.057521 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-combined-ca-bundle\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.057582 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-config-data\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.057696 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-scripts\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.071656 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c603be1-c02e-4c29-ae4c-f0209436fa4b-amphora-certs\") pod \"octavia-housekeeping-bk2pv\" (UID: \"3c603be1-c02e-4c29-ae4c-f0209436fa4b\") " pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.202884 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.494804 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-zfzzm"] Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.497581 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.502088 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.525551 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-zfzzm"] Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.560253 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.560629 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-combined-ca-bundle\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.560716 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-scripts\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.560957 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data-merged\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.663524 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.663677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-combined-ca-bundle\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.663708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-scripts\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.664370 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data-merged\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.665224 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data-merged\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.668370 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.671524 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-scripts\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.671567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-combined-ca-bundle\") pod \"octavia-db-sync-zfzzm\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:56 crc kubenswrapper[4892]: I0217 19:23:56.831407 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:23:57 crc kubenswrapper[4892]: I0217 19:23:57.276467 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-bk2pv"] Feb 17 19:23:57 crc kubenswrapper[4892]: I0217 19:23:57.417098 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-zfzzm"] Feb 17 19:23:57 crc kubenswrapper[4892]: I0217 19:23:57.596367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-bk2pv" event={"ID":"3c603be1-c02e-4c29-ae4c-f0209436fa4b","Type":"ContainerStarted","Data":"9aa6f7ea783975b9afac5d5287bc37ac8b061d812534d319b46c4c158260bc59"} Feb 17 19:23:57 crc kubenswrapper[4892]: I0217 19:23:57.598192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zfzzm" event={"ID":"909c89b3-4107-4f01-b4ec-3ecea629d2d4","Type":"ContainerStarted","Data":"078bc9401fabcfac08f13f6eb44f2d0b168e36eb3b4e7477e21db0c9a6396fb3"} Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.171375 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-pck59"] Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.174271 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.185879 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.186111 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.192792 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-pck59"] Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.204511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cda6c9c6-98f3-4399-b095-ce45683ebd27-config-data-merged\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.204606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cda6c9c6-98f3-4399-b095-ce45683ebd27-hm-ports\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.204657 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-config-data\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.204692 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-scripts\") pod 
\"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.204803 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-amphora-certs\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.204892 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-combined-ca-bundle\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.306893 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cda6c9c6-98f3-4399-b095-ce45683ebd27-config-data-merged\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.306974 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cda6c9c6-98f3-4399-b095-ce45683ebd27-hm-ports\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.307009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-config-data\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " 
pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.307040 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-scripts\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.307066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-amphora-certs\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.307088 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-combined-ca-bundle\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.307549 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cda6c9c6-98f3-4399-b095-ce45683ebd27-config-data-merged\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.308662 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cda6c9c6-98f3-4399-b095-ce45683ebd27-hm-ports\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.313248 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-config-data\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.324642 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-amphora-certs\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.324736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-combined-ca-bundle\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.324832 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda6c9c6-98f3-4399-b095-ce45683ebd27-scripts\") pod \"octavia-worker-pck59\" (UID: \"cda6c9c6-98f3-4399-b095-ce45683ebd27\") " pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.493067 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-pck59" Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.614085 4892 generic.go:334] "Generic (PLEG): container finished" podID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerID="f0fc1541762ff375ec48c4af8b28fe89e3d6e560ab6030c282b84a2446b1d4fd" exitCode=0 Feb 17 19:23:58 crc kubenswrapper[4892]: I0217 19:23:58.614126 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zfzzm" event={"ID":"909c89b3-4107-4f01-b4ec-3ecea629d2d4","Type":"ContainerDied","Data":"f0fc1541762ff375ec48c4af8b28fe89e3d6e560ab6030c282b84a2446b1d4fd"} Feb 17 19:24:02 crc kubenswrapper[4892]: I0217 19:24:02.616622 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-pck59"] Feb 17 19:24:02 crc kubenswrapper[4892]: W0217 19:24:02.648012 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcda6c9c6_98f3_4399_b095_ce45683ebd27.slice/crio-da53d5b2e45e30bbc59936d50e1413c2a6bbc3f796468763ba2b213d198eeacc WatchSource:0}: Error finding container da53d5b2e45e30bbc59936d50e1413c2a6bbc3f796468763ba2b213d198eeacc: Status 404 returned error can't find the container with id da53d5b2e45e30bbc59936d50e1413c2a6bbc3f796468763ba2b213d198eeacc Feb 17 19:24:02 crc kubenswrapper[4892]: I0217 19:24:02.952364 4892 scope.go:117] "RemoveContainer" containerID="e6e78e0f2a0470cbf58e69d1585bb6969f0d97e46cc06cd5ef2a9b3afb3e688f" Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.016393 4892 scope.go:117] "RemoveContainer" containerID="7d0717f80637382ea7bd19ae66fb8bd6d9d1456f0d28546bf405ca9582346188" Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.085945 4892 scope.go:117] "RemoveContainer" containerID="79b3dcad0eafacf89d149280a9c5f685baac833ab204cf6d69254eeca63c0ff0" Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.668140 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-worker-pck59" event={"ID":"cda6c9c6-98f3-4399-b095-ce45683ebd27","Type":"ContainerStarted","Data":"da53d5b2e45e30bbc59936d50e1413c2a6bbc3f796468763ba2b213d198eeacc"} Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.670327 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c6t7x" event={"ID":"2fcd17ca-4457-418b-8dc1-3f8b5e485d72","Type":"ContainerStarted","Data":"57c71b9f7bd630aa213968e28cb0f4b2ca73cf0104062c024220dd16ae7669cd"} Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.671415 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.673174 4892 generic.go:334] "Generic (PLEG): container finished" podID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerID="cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1" exitCode=0 Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.673216 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" event={"ID":"c5c321c3-f90f-4f24-b7ed-2027e64b6e01","Type":"ContainerDied","Data":"cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1"} Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.675143 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-bk2pv" event={"ID":"3c603be1-c02e-4c29-ae4c-f0209436fa4b","Type":"ContainerStarted","Data":"f5de2e9015497e352f46ca91043836bff6e66da99932d7d0f7412f464156767a"} Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.693686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zfzzm" event={"ID":"909c89b3-4107-4f01-b4ec-3ecea629d2d4","Type":"ContainerStarted","Data":"3f10b4fc486f1f3b519ee8504b216b2ba4a9f79fdddc06c664d2d1187819d9fb"} Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.702752 4892 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/octavia-rsyslog-c6t7x" podStartSLOduration=2.454291933 podStartE2EDuration="13.702733741s" podCreationTimestamp="2026-02-17 19:23:50 +0000 UTC" firstStartedPulling="2026-02-17 19:23:51.392948847 +0000 UTC m=+6002.768352112" lastFinishedPulling="2026-02-17 19:24:02.641390655 +0000 UTC m=+6014.016793920" observedRunningTime="2026-02-17 19:24:03.690003986 +0000 UTC m=+6015.065407261" watchObservedRunningTime="2026-02-17 19:24:03.702733741 +0000 UTC m=+6015.078137006" Feb 17 19:24:03 crc kubenswrapper[4892]: I0217 19:24:03.752260 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-zfzzm" podStartSLOduration=7.752243442 podStartE2EDuration="7.752243442s" podCreationTimestamp="2026-02-17 19:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:24:03.742885249 +0000 UTC m=+6015.118288524" watchObservedRunningTime="2026-02-17 19:24:03.752243442 +0000 UTC m=+6015.127646707" Feb 17 19:24:04 crc kubenswrapper[4892]: I0217 19:24:04.705596 4892 generic.go:334] "Generic (PLEG): container finished" podID="3c603be1-c02e-4c29-ae4c-f0209436fa4b" containerID="f5de2e9015497e352f46ca91043836bff6e66da99932d7d0f7412f464156767a" exitCode=0 Feb 17 19:24:04 crc kubenswrapper[4892]: I0217 19:24:04.705672 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-bk2pv" event={"ID":"3c603be1-c02e-4c29-ae4c-f0209436fa4b","Type":"ContainerDied","Data":"f5de2e9015497e352f46ca91043836bff6e66da99932d7d0f7412f464156767a"} Feb 17 19:24:04 crc kubenswrapper[4892]: I0217 19:24:04.711075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pck59" event={"ID":"cda6c9c6-98f3-4399-b095-ce45683ebd27","Type":"ContainerStarted","Data":"efec32ead855996a7180a35be1a2bf8b43b3af03c42c50417eb878e26ac5c323"} Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.725754 4892 
generic.go:334] "Generic (PLEG): container finished" podID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerID="3f10b4fc486f1f3b519ee8504b216b2ba4a9f79fdddc06c664d2d1187819d9fb" exitCode=0 Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.725860 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zfzzm" event={"ID":"909c89b3-4107-4f01-b4ec-3ecea629d2d4","Type":"ContainerDied","Data":"3f10b4fc486f1f3b519ee8504b216b2ba4a9f79fdddc06c664d2d1187819d9fb"} Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.728669 4892 generic.go:334] "Generic (PLEG): container finished" podID="cda6c9c6-98f3-4399-b095-ce45683ebd27" containerID="efec32ead855996a7180a35be1a2bf8b43b3af03c42c50417eb878e26ac5c323" exitCode=0 Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.728781 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pck59" event={"ID":"cda6c9c6-98f3-4399-b095-ce45683ebd27","Type":"ContainerDied","Data":"efec32ead855996a7180a35be1a2bf8b43b3af03c42c50417eb878e26ac5c323"} Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.734452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" event={"ID":"c5c321c3-f90f-4f24-b7ed-2027e64b6e01","Type":"ContainerStarted","Data":"1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464"} Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.745774 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-bk2pv" event={"ID":"3c603be1-c02e-4c29-ae4c-f0209436fa4b","Type":"ContainerStarted","Data":"ce805cfccb679d3e8f47df02e166bdb4d826281960da33513a3ef0a903d8b753"} Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.746001 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.821312 4892 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" podStartSLOduration=1.473724403 podStartE2EDuration="14.821288231s" podCreationTimestamp="2026-02-17 19:23:51 +0000 UTC" firstStartedPulling="2026-02-17 19:23:51.912326088 +0000 UTC m=+6003.287729353" lastFinishedPulling="2026-02-17 19:24:05.259889906 +0000 UTC m=+6016.635293181" observedRunningTime="2026-02-17 19:24:05.798174406 +0000 UTC m=+6017.173577711" watchObservedRunningTime="2026-02-17 19:24:05.821288231 +0000 UTC m=+6017.196691506" Feb 17 19:24:05 crc kubenswrapper[4892]: I0217 19:24:05.829301 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-bk2pv" podStartSLOduration=5.434731522 podStartE2EDuration="10.829280188s" podCreationTimestamp="2026-02-17 19:23:55 +0000 UTC" firstStartedPulling="2026-02-17 19:23:57.314380208 +0000 UTC m=+6008.689783473" lastFinishedPulling="2026-02-17 19:24:02.708928874 +0000 UTC m=+6014.084332139" observedRunningTime="2026-02-17 19:24:05.827950721 +0000 UTC m=+6017.203354046" watchObservedRunningTime="2026-02-17 19:24:05.829280188 +0000 UTC m=+6017.204683453" Feb 17 19:24:06 crc kubenswrapper[4892]: I0217 19:24:06.359596 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:24:06 crc kubenswrapper[4892]: E0217 19:24:06.360148 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:24:06 crc kubenswrapper[4892]: I0217 19:24:06.762533 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pck59" 
event={"ID":"cda6c9c6-98f3-4399-b095-ce45683ebd27","Type":"ContainerStarted","Data":"0265736f8584c18fcdd65df3c95ab836eb82908c8c698aba277204e7e4f2d3a3"} Feb 17 19:24:06 crc kubenswrapper[4892]: I0217 19:24:06.800408 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-pck59" podStartSLOduration=7.430843776 podStartE2EDuration="8.800365789s" podCreationTimestamp="2026-02-17 19:23:58 +0000 UTC" firstStartedPulling="2026-02-17 19:24:02.651063777 +0000 UTC m=+6014.026467042" lastFinishedPulling="2026-02-17 19:24:04.02058579 +0000 UTC m=+6015.395989055" observedRunningTime="2026-02-17 19:24:06.797330427 +0000 UTC m=+6018.172733692" watchObservedRunningTime="2026-02-17 19:24:06.800365789 +0000 UTC m=+6018.175769054" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.280811 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.366871 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-combined-ca-bundle\") pod \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.366963 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data\") pod \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.367038 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data-merged\") pod \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\" (UID: 
\"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.367063 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-scripts\") pod \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\" (UID: \"909c89b3-4107-4f01-b4ec-3ecea629d2d4\") " Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.389836 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-scripts" (OuterVolumeSpecName: "scripts") pod "909c89b3-4107-4f01-b4ec-3ecea629d2d4" (UID: "909c89b3-4107-4f01-b4ec-3ecea629d2d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.416581 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data" (OuterVolumeSpecName: "config-data") pod "909c89b3-4107-4f01-b4ec-3ecea629d2d4" (UID: "909c89b3-4107-4f01-b4ec-3ecea629d2d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.419125 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "909c89b3-4107-4f01-b4ec-3ecea629d2d4" (UID: "909c89b3-4107-4f01-b4ec-3ecea629d2d4"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.433505 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "909c89b3-4107-4f01-b4ec-3ecea629d2d4" (UID: "909c89b3-4107-4f01-b4ec-3ecea629d2d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.468920 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.468949 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/909c89b3-4107-4f01-b4ec-3ecea629d2d4-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.468960 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.468968 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909c89b3-4107-4f01-b4ec-3ecea629d2d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.778922 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-zfzzm" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.781015 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zfzzm" event={"ID":"909c89b3-4107-4f01-b4ec-3ecea629d2d4","Type":"ContainerDied","Data":"078bc9401fabcfac08f13f6eb44f2d0b168e36eb3b4e7477e21db0c9a6396fb3"} Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.781124 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="078bc9401fabcfac08f13f6eb44f2d0b168e36eb3b4e7477e21db0c9a6396fb3" Feb 17 19:24:07 crc kubenswrapper[4892]: I0217 19:24:07.781180 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-pck59" Feb 17 19:24:11 crc kubenswrapper[4892]: I0217 19:24:11.255577 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-bk2pv" Feb 17 19:24:13 crc kubenswrapper[4892]: I0217 19:24:13.555404 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-pck59" Feb 17 19:24:17 crc kubenswrapper[4892]: I0217 19:24:17.360347 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:24:17 crc kubenswrapper[4892]: E0217 19:24:17.361110 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:24:20 crc kubenswrapper[4892]: I0217 19:24:20.801482 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-c6t7x" Feb 17 19:24:28 crc kubenswrapper[4892]: 
I0217 19:24:28.362635 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:24:28 crc kubenswrapper[4892]: E0217 19:24:28.394920 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.113276 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-p4jsp"] Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.114210 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerName="octavia-amphora-httpd" containerID="cri-o://1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464" gracePeriod=30 Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.729507 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.800855 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-amphora-image\") pod \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.851038 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "c5c321c3-f90f-4f24-b7ed-2027e64b6e01" (UID: "c5c321c3-f90f-4f24-b7ed-2027e64b6e01"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.902605 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-httpd-config\") pod \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\" (UID: \"c5c321c3-f90f-4f24-b7ed-2027e64b6e01\") " Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.903231 4892 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-amphora-image\") on node \"crc\" DevicePath \"\"" Feb 17 19:24:37 crc kubenswrapper[4892]: I0217 19:24:37.936214 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c5c321c3-f90f-4f24-b7ed-2027e64b6e01" (UID: "c5c321c3-f90f-4f24-b7ed-2027e64b6e01"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.004471 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5c321c3-f90f-4f24-b7ed-2027e64b6e01-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.233628 4892 generic.go:334] "Generic (PLEG): container finished" podID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerID="1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464" exitCode=0 Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.233680 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" event={"ID":"c5c321c3-f90f-4f24-b7ed-2027e64b6e01","Type":"ContainerDied","Data":"1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464"} Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.233716 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" event={"ID":"c5c321c3-f90f-4f24-b7ed-2027e64b6e01","Type":"ContainerDied","Data":"36fad1b3c74e4563f20d05809e818b69a40ba170918ee856787d6521123bf857"} Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.233727 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-p4jsp" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.233739 4892 scope.go:117] "RemoveContainer" containerID="1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.294306 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-p4jsp"] Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.299627 4892 scope.go:117] "RemoveContainer" containerID="cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.306523 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-p4jsp"] Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.334562 4892 scope.go:117] "RemoveContainer" containerID="1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464" Feb 17 19:24:38 crc kubenswrapper[4892]: E0217 19:24:38.335191 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464\": container with ID starting with 1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464 not found: ID does not exist" containerID="1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.335238 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464"} err="failed to get container status \"1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464\": rpc error: code = NotFound desc = could not find container \"1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464\": container with ID starting with 1642b43cd0f53e496b14ddd84284a97fb94ee5a50db5bd3b02e2815d122fd464 not 
found: ID does not exist" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.335268 4892 scope.go:117] "RemoveContainer" containerID="cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1" Feb 17 19:24:38 crc kubenswrapper[4892]: E0217 19:24:38.335719 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1\": container with ID starting with cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1 not found: ID does not exist" containerID="cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1" Feb 17 19:24:38 crc kubenswrapper[4892]: I0217 19:24:38.335756 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1"} err="failed to get container status \"cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1\": rpc error: code = NotFound desc = could not find container \"cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1\": container with ID starting with cf2fa0bcac7c69eb0f9d966e0b45f8237fa69d7cd8747c173d68ecadad7cced1 not found: ID does not exist" Feb 17 19:24:39 crc kubenswrapper[4892]: I0217 19:24:39.381244 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" path="/var/lib/kubelet/pods/c5c321c3-f90f-4f24-b7ed-2027e64b6e01/volumes" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.360983 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.838006 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-tgk79"] Feb 17 19:24:42 crc kubenswrapper[4892]: E0217 19:24:42.839139 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerName="init" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.839169 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerName="init" Feb 17 19:24:42 crc kubenswrapper[4892]: E0217 19:24:42.839192 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerName="octavia-amphora-httpd" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.839207 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerName="octavia-amphora-httpd" Feb 17 19:24:42 crc kubenswrapper[4892]: E0217 19:24:42.839245 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerName="octavia-db-sync" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.839260 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerName="octavia-db-sync" Feb 17 19:24:42 crc kubenswrapper[4892]: E0217 19:24:42.839319 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerName="init" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.839333 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerName="init" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.839747 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" containerName="octavia-db-sync" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.839802 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c321c3-f90f-4f24-b7ed-2027e64b6e01" containerName="octavia-amphora-httpd" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.841978 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.848399 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.853891 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-tgk79"] Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.961374 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4e42272-b37b-4214-9a11-de8ce611d1b3-httpd-config\") pod \"octavia-image-upload-59f8cff499-tgk79\" (UID: \"a4e42272-b37b-4214-9a11-de8ce611d1b3\") " pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:42 crc kubenswrapper[4892]: I0217 19:24:42.962011 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4e42272-b37b-4214-9a11-de8ce611d1b3-amphora-image\") pod \"octavia-image-upload-59f8cff499-tgk79\" (UID: \"a4e42272-b37b-4214-9a11-de8ce611d1b3\") " pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.063328 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4e42272-b37b-4214-9a11-de8ce611d1b3-amphora-image\") pod \"octavia-image-upload-59f8cff499-tgk79\" (UID: \"a4e42272-b37b-4214-9a11-de8ce611d1b3\") " pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.063515 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4e42272-b37b-4214-9a11-de8ce611d1b3-httpd-config\") pod \"octavia-image-upload-59f8cff499-tgk79\" (UID: 
\"a4e42272-b37b-4214-9a11-de8ce611d1b3\") " pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.064006 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4e42272-b37b-4214-9a11-de8ce611d1b3-amphora-image\") pod \"octavia-image-upload-59f8cff499-tgk79\" (UID: \"a4e42272-b37b-4214-9a11-de8ce611d1b3\") " pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.071053 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4e42272-b37b-4214-9a11-de8ce611d1b3-httpd-config\") pod \"octavia-image-upload-59f8cff499-tgk79\" (UID: \"a4e42272-b37b-4214-9a11-de8ce611d1b3\") " pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.187426 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-tgk79" Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.324284 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"2c781ca08bf3af58232a3452d8a6e42ca4ea1616baa6e1729ebe4342764473bf"} Feb 17 19:24:43 crc kubenswrapper[4892]: I0217 19:24:43.716579 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-tgk79"] Feb 17 19:24:43 crc kubenswrapper[4892]: W0217 19:24:43.723492 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4e42272_b37b_4214_9a11_de8ce611d1b3.slice/crio-90eb15db9d89da5c21a6b625a5624c150e2ca18b77074add1ea0305b6245cd73 WatchSource:0}: Error finding container 
90eb15db9d89da5c21a6b625a5624c150e2ca18b77074add1ea0305b6245cd73: Status 404 returned error can't find the container with id 90eb15db9d89da5c21a6b625a5624c150e2ca18b77074add1ea0305b6245cd73 Feb 17 19:24:44 crc kubenswrapper[4892]: I0217 19:24:44.340380 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-tgk79" event={"ID":"a4e42272-b37b-4214-9a11-de8ce611d1b3","Type":"ContainerStarted","Data":"90eb15db9d89da5c21a6b625a5624c150e2ca18b77074add1ea0305b6245cd73"} Feb 17 19:24:45 crc kubenswrapper[4892]: I0217 19:24:45.357800 4892 generic.go:334] "Generic (PLEG): container finished" podID="a4e42272-b37b-4214-9a11-de8ce611d1b3" containerID="188eab2ce1dccd6dcd4a1c4c9c14a289ab1c7ae704bc493f1e32784942c25586" exitCode=0 Feb 17 19:24:45 crc kubenswrapper[4892]: I0217 19:24:45.357961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-tgk79" event={"ID":"a4e42272-b37b-4214-9a11-de8ce611d1b3","Type":"ContainerDied","Data":"188eab2ce1dccd6dcd4a1c4c9c14a289ab1c7ae704bc493f1e32784942c25586"} Feb 17 19:24:47 crc kubenswrapper[4892]: I0217 19:24:47.395645 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-tgk79" event={"ID":"a4e42272-b37b-4214-9a11-de8ce611d1b3","Type":"ContainerStarted","Data":"4b87a7e7523d4c706acd90c551349172c651734648f7b23aafc9e213063db450"} Feb 17 19:24:47 crc kubenswrapper[4892]: I0217 19:24:47.421640 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-tgk79" podStartSLOduration=2.500343591 podStartE2EDuration="5.421622812s" podCreationTimestamp="2026-02-17 19:24:42 +0000 UTC" firstStartedPulling="2026-02-17 19:24:43.726149582 +0000 UTC m=+6055.101552847" lastFinishedPulling="2026-02-17 19:24:46.647428773 +0000 UTC m=+6058.022832068" observedRunningTime="2026-02-17 19:24:47.418030184 +0000 UTC m=+6058.793433449" 
watchObservedRunningTime="2026-02-17 19:24:47.421622812 +0000 UTC m=+6058.797026067" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.407743 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-crjvb"] Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.411718 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.414960 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.415300 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.419310 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-crjvb"] Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.507450 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-combined-ca-bundle\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.507606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-hm-ports\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.507662 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-config-data-merged\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.507688 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-config-data\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.507712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-scripts\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.507736 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-amphora-certs\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.608854 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-hm-ports\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.608944 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-config-data-merged\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.608977 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-config-data\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.609008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-scripts\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.609040 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-amphora-certs\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.609087 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-combined-ca-bundle\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.609827 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-config-data-merged\") pod 
\"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.610098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-hm-ports\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.616335 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-combined-ca-bundle\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.616954 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-amphora-certs\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.617735 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-config-data\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 crc kubenswrapper[4892]: I0217 19:24:51.618387 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154030e3-3d0f-4c51-8a6e-e9ddd9238c03-scripts\") pod \"octavia-healthmanager-crjvb\" (UID: \"154030e3-3d0f-4c51-8a6e-e9ddd9238c03\") " pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:51 
crc kubenswrapper[4892]: I0217 19:24:51.760663 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:52 crc kubenswrapper[4892]: I0217 19:24:52.390150 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-crjvb"] Feb 17 19:24:52 crc kubenswrapper[4892]: I0217 19:24:52.467103 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-crjvb" event={"ID":"154030e3-3d0f-4c51-8a6e-e9ddd9238c03","Type":"ContainerStarted","Data":"279917e3372d878f3d6b9dea4b13a87b5c82cdfa094f278b5dbd0a9d7b93551f"} Feb 17 19:24:52 crc kubenswrapper[4892]: I0217 19:24:52.680449 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-crjvb"] Feb 17 19:24:53 crc kubenswrapper[4892]: I0217 19:24:53.482295 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-crjvb" event={"ID":"154030e3-3d0f-4c51-8a6e-e9ddd9238c03","Type":"ContainerStarted","Data":"261bd03eff962c12708e0c25e556984144957dd116b4cd6ee288b057f324b528"} Feb 17 19:24:55 crc kubenswrapper[4892]: I0217 19:24:55.528408 4892 generic.go:334] "Generic (PLEG): container finished" podID="154030e3-3d0f-4c51-8a6e-e9ddd9238c03" containerID="261bd03eff962c12708e0c25e556984144957dd116b4cd6ee288b057f324b528" exitCode=0 Feb 17 19:24:55 crc kubenswrapper[4892]: I0217 19:24:55.528497 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-crjvb" event={"ID":"154030e3-3d0f-4c51-8a6e-e9ddd9238c03","Type":"ContainerDied","Data":"261bd03eff962c12708e0c25e556984144957dd116b4cd6ee288b057f324b528"} Feb 17 19:24:57 crc kubenswrapper[4892]: I0217 19:24:57.561391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-crjvb" 
event={"ID":"154030e3-3d0f-4c51-8a6e-e9ddd9238c03","Type":"ContainerStarted","Data":"08c2bf1d1ac27407967f8e58276a55a7c71150cc998670881edb341d06f7a88b"} Feb 17 19:24:57 crc kubenswrapper[4892]: I0217 19:24:57.562847 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:24:57 crc kubenswrapper[4892]: I0217 19:24:57.635563 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-crjvb" podStartSLOduration=6.6355383329999995 podStartE2EDuration="6.635538333s" podCreationTimestamp="2026-02-17 19:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:24:57.592107516 +0000 UTC m=+6068.967510821" watchObservedRunningTime="2026-02-17 19:24:57.635538333 +0000 UTC m=+6069.010941608" Feb 17 19:25:06 crc kubenswrapper[4892]: I0217 19:25:06.809567 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-crjvb" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.346789 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-689d4548cc-cqqrq"] Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.350540 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.352213 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-config-data\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.352363 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkgcf\" (UniqueName: \"kubernetes.io/projected/a27f67b7-efd3-4634-b422-99b953a1e67c-kube-api-access-kkgcf\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.352477 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-scripts\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.352603 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a27f67b7-efd3-4634-b422-99b953a1e67c-horizon-secret-key\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.352698 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27f67b7-efd3-4634-b422-99b953a1e67c-logs\") pod \"horizon-689d4548cc-cqqrq\" (UID: 
\"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.354381 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.354946 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.355109 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-bx6wr" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.355259 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.372139 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689d4548cc-cqqrq"] Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.424195 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.424402 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-log" containerID="cri-o://3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def" gracePeriod=30 Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.424807 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-httpd" containerID="cri-o://6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73" gracePeriod=30 Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.456507 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-scripts\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.456625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a27f67b7-efd3-4634-b422-99b953a1e67c-horizon-secret-key\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.456649 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27f67b7-efd3-4634-b422-99b953a1e67c-logs\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.456883 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-config-data\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.456979 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkgcf\" (UniqueName: \"kubernetes.io/projected/a27f67b7-efd3-4634-b422-99b953a1e67c-kube-api-access-kkgcf\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.457168 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27f67b7-efd3-4634-b422-99b953a1e67c-logs\") pod \"horizon-689d4548cc-cqqrq\" (UID: 
\"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.457590 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-scripts\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.458124 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-config-data\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.463071 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a27f67b7-efd3-4634-b422-99b953a1e67c-horizon-secret-key\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.473660 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-598d6b48f9-h5jq8"] Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.487455 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.495198 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-598d6b48f9-h5jq8"] Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.497944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkgcf\" (UniqueName: \"kubernetes.io/projected/a27f67b7-efd3-4634-b422-99b953a1e67c-kube-api-access-kkgcf\") pod \"horizon-689d4548cc-cqqrq\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.514874 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.515287 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-log" containerID="cri-o://0d7cb5bc3f04fca0ac6d8c4eebfdaa5cfd90cc8171568f754b0730622a602e14" gracePeriod=30 Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.515716 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-httpd" containerID="cri-o://c71aded13f218c065f94e937f42ab143edd6b47a10b7ce47acb3faee6b4c612f" gracePeriod=30 Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.559075 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a022a2b-3eb8-4b7b-a496-5649767568a7-logs\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.559121 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-scripts\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.559203 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-config-data\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.559272 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a022a2b-3eb8-4b7b-a496-5649767568a7-horizon-secret-key\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.559290 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2d4\" (UniqueName: \"kubernetes.io/projected/8a022a2b-3eb8-4b7b-a496-5649767568a7-kube-api-access-lt2d4\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.661823 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a022a2b-3eb8-4b7b-a496-5649767568a7-logs\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.661962 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-scripts\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.662113 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-config-data\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.662267 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a022a2b-3eb8-4b7b-a496-5649767568a7-horizon-secret-key\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.662346 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2d4\" (UniqueName: \"kubernetes.io/projected/8a022a2b-3eb8-4b7b-a496-5649767568a7-kube-api-access-lt2d4\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.662451 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a022a2b-3eb8-4b7b-a496-5649767568a7-logs\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.662804 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-scripts\") pod 
\"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.663248 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-config-data\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.665995 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a022a2b-3eb8-4b7b-a496-5649767568a7-horizon-secret-key\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.668070 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:14 crc kubenswrapper[4892]: I0217 19:25:14.678723 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2d4\" (UniqueName: \"kubernetes.io/projected/8a022a2b-3eb8-4b7b-a496-5649767568a7-kube-api-access-lt2d4\") pod \"horizon-598d6b48f9-h5jq8\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:14.820215 4892 generic.go:334] "Generic (PLEG): container finished" podID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerID="0d7cb5bc3f04fca0ac6d8c4eebfdaa5cfd90cc8171568f754b0730622a602e14" exitCode=143 Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:14.820398 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d04280f1-af10-4187-a3e9-7f7384dafc7d","Type":"ContainerDied","Data":"0d7cb5bc3f04fca0ac6d8c4eebfdaa5cfd90cc8171568f754b0730622a602e14"} Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:14.823891 4892 generic.go:334] "Generic (PLEG): container finished" podID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerID="3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def" exitCode=143 Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:14.823929 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc2d1c9d-6329-44ea-b176-b8734429b1da","Type":"ContainerDied","Data":"3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def"} Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:14.911209 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.416117 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-598d6b48f9-h5jq8"] Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.443780 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fdbc494b5-smb4m"] Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.472708 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fdbc494b5-smb4m"] Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.472801 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.584765 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4976d40b-aca4-4ac9-956c-ff2719378071-logs\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.584826 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-config-data\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.584849 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcmw4\" (UniqueName: \"kubernetes.io/projected/4976d40b-aca4-4ac9-956c-ff2719378071-kube-api-access-lcmw4\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.584901 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-scripts\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.584943 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4976d40b-aca4-4ac9-956c-ff2719378071-horizon-secret-key\") pod \"horizon-5fdbc494b5-smb4m\" (UID: 
\"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.686898 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4976d40b-aca4-4ac9-956c-ff2719378071-logs\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.686948 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-config-data\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.686969 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcmw4\" (UniqueName: \"kubernetes.io/projected/4976d40b-aca4-4ac9-956c-ff2719378071-kube-api-access-lcmw4\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.687019 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-scripts\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.687059 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4976d40b-aca4-4ac9-956c-ff2719378071-horizon-secret-key\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc 
kubenswrapper[4892]: I0217 19:25:15.687678 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4976d40b-aca4-4ac9-956c-ff2719378071-logs\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.689914 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-scripts\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.690856 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-config-data\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.693407 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4976d40b-aca4-4ac9-956c-ff2719378071-horizon-secret-key\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.702393 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcmw4\" (UniqueName: \"kubernetes.io/projected/4976d40b-aca4-4ac9-956c-ff2719378071-kube-api-access-lcmw4\") pod \"horizon-5fdbc494b5-smb4m\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") " pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.788475 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-598d6b48f9-h5jq8"] Feb 17 19:25:15 
crc kubenswrapper[4892]: W0217 19:25:15.822657 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27f67b7_efd3_4634_b422_99b953a1e67c.slice/crio-873b628e052f77a802e45c68fe101fe09ef88b33df6df204b8ecd4e8b8c22ce2 WatchSource:0}: Error finding container 873b628e052f77a802e45c68fe101fe09ef88b33df6df204b8ecd4e8b8c22ce2: Status 404 returned error can't find the container with id 873b628e052f77a802e45c68fe101fe09ef88b33df6df204b8ecd4e8b8c22ce2 Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.823291 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689d4548cc-cqqrq"] Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.830639 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.836723 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-598d6b48f9-h5jq8" event={"ID":"8a022a2b-3eb8-4b7b-a496-5649767568a7","Type":"ContainerStarted","Data":"c13d33898c08447efeaa9378c208e076e604e9597bd08faabe1301743fdfc485"} Feb 17 19:25:15 crc kubenswrapper[4892]: I0217 19:25:15.838177 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d4548cc-cqqrq" event={"ID":"a27f67b7-efd3-4634-b422-99b953a1e67c","Type":"ContainerStarted","Data":"873b628e052f77a802e45c68fe101fe09ef88b33df6df204b8ecd4e8b8c22ce2"} Feb 17 19:25:16 crc kubenswrapper[4892]: I0217 19:25:16.303310 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fdbc494b5-smb4m"] Feb 17 19:25:16 crc kubenswrapper[4892]: I0217 19:25:16.855272 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fdbc494b5-smb4m" event={"ID":"4976d40b-aca4-4ac9-956c-ff2719378071","Type":"ContainerStarted","Data":"3c0cd342a848fee3d40486414796d6ecce47cedfa6affe5974990e3a3f8da3e3"} Feb 17 19:25:17 crc kubenswrapper[4892]: 
I0217 19:25:17.876171 4892 generic.go:334] "Generic (PLEG): container finished" podID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerID="c71aded13f218c065f94e937f42ab143edd6b47a10b7ce47acb3faee6b4c612f" exitCode=0 Feb 17 19:25:17 crc kubenswrapper[4892]: I0217 19:25:17.876250 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04280f1-af10-4187-a3e9-7f7384dafc7d","Type":"ContainerDied","Data":"c71aded13f218c065f94e937f42ab143edd6b47a10b7ce47acb3faee6b4c612f"} Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.239293 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.359624 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgb8\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-kube-api-access-clgb8\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.359851 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-config-data\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.359879 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-scripts\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.359904 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-httpd-run\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.359942 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-logs\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.359969 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-ceph\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.360008 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-combined-ca-bundle\") pod \"d04280f1-af10-4187-a3e9-7f7384dafc7d\" (UID: \"d04280f1-af10-4187-a3e9-7f7384dafc7d\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.368330 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.368391 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-logs" (OuterVolumeSpecName: "logs") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.373982 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-scripts" (OuterVolumeSpecName: "scripts") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.378506 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-kube-api-access-clgb8" (OuterVolumeSpecName: "kube-api-access-clgb8") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). InnerVolumeSpecName "kube-api-access-clgb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.381338 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-ceph" (OuterVolumeSpecName: "ceph") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.425511 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.449301 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.465372 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgb8\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-kube-api-access-clgb8\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.465402 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.465412 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.465422 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04280f1-af10-4187-a3e9-7f7384dafc7d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.465448 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d04280f1-af10-4187-a3e9-7f7384dafc7d-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.465457 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.473144 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-config-data" (OuterVolumeSpecName: "config-data") pod "d04280f1-af10-4187-a3e9-7f7384dafc7d" (UID: "d04280f1-af10-4187-a3e9-7f7384dafc7d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.567614 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-combined-ca-bundle\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.567667 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-httpd-run\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.567685 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-logs\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.567793 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78jn\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-kube-api-access-p78jn\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.567882 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-ceph\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.567922 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-config-data\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.568031 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-scripts\") pod \"bc2d1c9d-6329-44ea-b176-b8734429b1da\" (UID: \"bc2d1c9d-6329-44ea-b176-b8734429b1da\") " Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.568520 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04280f1-af10-4187-a3e9-7f7384dafc7d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.569065 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-logs" (OuterVolumeSpecName: "logs") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.569255 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.576288 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-kube-api-access-p78jn" (OuterVolumeSpecName: "kube-api-access-p78jn") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). 
InnerVolumeSpecName "kube-api-access-p78jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.576393 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-ceph" (OuterVolumeSpecName: "ceph") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.578586 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-scripts" (OuterVolumeSpecName: "scripts") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.609990 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.635023 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-config-data" (OuterVolumeSpecName: "config-data") pod "bc2d1c9d-6329-44ea-b176-b8734429b1da" (UID: "bc2d1c9d-6329-44ea-b176-b8734429b1da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670895 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670926 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670935 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc2d1c9d-6329-44ea-b176-b8734429b1da-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670945 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78jn\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-kube-api-access-p78jn\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670957 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc2d1c9d-6329-44ea-b176-b8734429b1da-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670965 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.670972 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2d1c9d-6329-44ea-b176-b8734429b1da-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.888181 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerID="6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73" exitCode=0 Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.888249 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.888254 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc2d1c9d-6329-44ea-b176-b8734429b1da","Type":"ContainerDied","Data":"6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73"} Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.888367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc2d1c9d-6329-44ea-b176-b8734429b1da","Type":"ContainerDied","Data":"837fccdb42b1c5b047a6a136afa8ffddf5e176b04333d4bd1a7544ddd3269f6a"} Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.888387 4892 scope.go:117] "RemoveContainer" containerID="6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.892786 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04280f1-af10-4187-a3e9-7f7384dafc7d","Type":"ContainerDied","Data":"f7cbed2c4a0bb34b6d6c784484695056378dbd1136bf16ba41786bd6d9855178"} Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.892874 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.921998 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.938374 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.956182 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:25:18 crc kubenswrapper[4892]: E0217 19:25:18.956707 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-log" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.956724 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-log" Feb 17 19:25:18 crc kubenswrapper[4892]: E0217 19:25:18.956765 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-log" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.956772 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-log" Feb 17 19:25:18 crc kubenswrapper[4892]: E0217 19:25:18.956790 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-httpd" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.956796 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-httpd" Feb 17 19:25:18 crc kubenswrapper[4892]: E0217 19:25:18.956806 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-httpd" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.956824 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-httpd" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.962713 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-httpd" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.962742 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-httpd" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.962762 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" containerName="glance-log" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.962778 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" containerName="glance-log" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.963965 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.967587 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.978498 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.978623 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zvxqz" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.978885 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.979415 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 19:25:18 crc kubenswrapper[4892]: I0217 19:25:18.997700 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.008047 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.010206 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.012571 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.020504 4892 scope.go:117] "RemoveContainer" containerID="3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.036279 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.050419 4892 scope.go:117] "RemoveContainer" containerID="6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73" Feb 17 19:25:19 crc kubenswrapper[4892]: E0217 19:25:19.050776 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73\": container with ID starting with 6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73 not found: ID does not exist" containerID="6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.050804 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73"} err="failed to get container status \"6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73\": rpc error: code = NotFound desc = could not find container \"6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73\": container with ID starting with 6a8154347440583902f5cedcf6c23d94b6a019f2da26bf85ba0a195e89efbd73 not found: ID does not exist" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.050884 4892 scope.go:117] "RemoveContainer" 
containerID="3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def" Feb 17 19:25:19 crc kubenswrapper[4892]: E0217 19:25:19.053182 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def\": container with ID starting with 3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def not found: ID does not exist" containerID="3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.053234 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def"} err="failed to get container status \"3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def\": rpc error: code = NotFound desc = could not find container \"3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def\": container with ID starting with 3806908dd7eae271c3ab77dd52415cf2e258aee15cd55da14c1ef8dcc9754def not found: ID does not exist" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.053261 4892 scope.go:117] "RemoveContainer" containerID="c71aded13f218c065f94e937f42ab143edd6b47a10b7ce47acb3faee6b4c612f" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078152 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253a67a0-8806-43b4-8994-10b9143ee4dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078191 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e79a3c-a8e5-4144-bedb-0e771ee43025-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078231 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078249 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078299 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6e79a3c-a8e5-4144-bedb-0e771ee43025-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078368 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078431 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253a67a0-8806-43b4-8994-10b9143ee4dd-logs\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/253a67a0-8806-43b4-8994-10b9143ee4dd-ceph\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078519 4892 scope.go:117] "RemoveContainer" containerID="0d7cb5bc3f04fca0ac6d8c4eebfdaa5cfd90cc8171568f754b0730622a602e14" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078523 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6e79a3c-a8e5-4144-bedb-0e771ee43025-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.078686 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jql9r\" (UniqueName: \"kubernetes.io/projected/253a67a0-8806-43b4-8994-10b9143ee4dd-kube-api-access-jql9r\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc 
kubenswrapper[4892]: I0217 19:25:19.078865 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.079002 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.079027 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shcmb\" (UniqueName: \"kubernetes.io/projected/e6e79a3c-a8e5-4144-bedb-0e771ee43025-kube-api-access-shcmb\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.180968 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181045 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc 
kubenswrapper[4892]: I0217 19:25:19.181064 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shcmb\" (UniqueName: \"kubernetes.io/projected/e6e79a3c-a8e5-4144-bedb-0e771ee43025-kube-api-access-shcmb\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253a67a0-8806-43b4-8994-10b9143ee4dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181113 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e79a3c-a8e5-4144-bedb-0e771ee43025-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181127 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181171 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181187 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6e79a3c-a8e5-4144-bedb-0e771ee43025-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181216 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181261 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253a67a0-8806-43b4-8994-10b9143ee4dd-logs\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181281 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/253a67a0-8806-43b4-8994-10b9143ee4dd-ceph\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181334 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/e6e79a3c-a8e5-4144-bedb-0e771ee43025-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.181352 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jql9r\" (UniqueName: \"kubernetes.io/projected/253a67a0-8806-43b4-8994-10b9143ee4dd-kube-api-access-jql9r\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.182864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253a67a0-8806-43b4-8994-10b9143ee4dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.182869 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253a67a0-8806-43b4-8994-10b9143ee4dd-logs\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.183136 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6e79a3c-a8e5-4144-bedb-0e771ee43025-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.183268 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e79a3c-a8e5-4144-bedb-0e771ee43025-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.185720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.190576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/253a67a0-8806-43b4-8994-10b9143ee4dd-ceph\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.190593 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.191985 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.195966 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253a67a0-8806-43b4-8994-10b9143ee4dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc 
kubenswrapper[4892]: I0217 19:25:19.196375 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.198621 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6e79a3c-a8e5-4144-bedb-0e771ee43025-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.198901 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e79a3c-a8e5-4144-bedb-0e771ee43025-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.199626 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shcmb\" (UniqueName: \"kubernetes.io/projected/e6e79a3c-a8e5-4144-bedb-0e771ee43025-kube-api-access-shcmb\") pod \"glance-default-internal-api-0\" (UID: \"e6e79a3c-a8e5-4144-bedb-0e771ee43025\") " pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.201413 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jql9r\" (UniqueName: \"kubernetes.io/projected/253a67a0-8806-43b4-8994-10b9143ee4dd-kube-api-access-jql9r\") pod \"glance-default-external-api-0\" (UID: \"253a67a0-8806-43b4-8994-10b9143ee4dd\") " pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.314514 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.335080 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.377916 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2d1c9d-6329-44ea-b176-b8734429b1da" path="/var/lib/kubelet/pods/bc2d1c9d-6329-44ea-b176-b8734429b1da/volumes" Feb 17 19:25:19 crc kubenswrapper[4892]: I0217 19:25:19.380712 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04280f1-af10-4187-a3e9-7f7384dafc7d" path="/var/lib/kubelet/pods/d04280f1-af10-4187-a3e9-7f7384dafc7d/volumes" Feb 17 19:25:24 crc kubenswrapper[4892]: I0217 19:25:24.963439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-598d6b48f9-h5jq8" event={"ID":"8a022a2b-3eb8-4b7b-a496-5649767568a7","Type":"ContainerStarted","Data":"9a15f6f6fdc92aa1fc1b7bb83a6e66b2aa71891dee85dd8553941908a3d17f54"} Feb 17 19:25:24 crc kubenswrapper[4892]: I0217 19:25:24.979271 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d4548cc-cqqrq" event={"ID":"a27f67b7-efd3-4634-b422-99b953a1e67c","Type":"ContainerStarted","Data":"5aa89074cc9ec5ffe1244cb742c7b589a0451dadf5d4241635ec96d3d80e6274"} Feb 17 19:25:24 crc kubenswrapper[4892]: I0217 19:25:24.993509 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fdbc494b5-smb4m" event={"ID":"4976d40b-aca4-4ac9-956c-ff2719378071","Type":"ContainerStarted","Data":"e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5"} Feb 17 19:25:25 crc kubenswrapper[4892]: I0217 19:25:25.110638 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 19:25:25 crc kubenswrapper[4892]: W0217 19:25:25.121127 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e79a3c_a8e5_4144_bedb_0e771ee43025.slice/crio-9100ed182c7420df14d2d289bdd5994d20eda652bc5d3a25e0a4cc34087b78a8 WatchSource:0}: Error finding container 9100ed182c7420df14d2d289bdd5994d20eda652bc5d3a25e0a4cc34087b78a8: Status 404 returned error can't find the container with id 9100ed182c7420df14d2d289bdd5994d20eda652bc5d3a25e0a4cc34087b78a8 Feb 17 19:25:25 crc kubenswrapper[4892]: W0217 19:25:25.205075 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253a67a0_8806_43b4_8994_10b9143ee4dd.slice/crio-6fad97d9cdb789758c1f9ac171709f13293b7766fec992786ded0e1f7cdba3d8 WatchSource:0}: Error finding container 6fad97d9cdb789758c1f9ac171709f13293b7766fec992786ded0e1f7cdba3d8: Status 404 returned error can't find the container with id 6fad97d9cdb789758c1f9ac171709f13293b7766fec992786ded0e1f7cdba3d8 Feb 17 19:25:25 crc kubenswrapper[4892]: I0217 19:25:25.214888 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.007213 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6e79a3c-a8e5-4144-bedb-0e771ee43025","Type":"ContainerStarted","Data":"0c38d8827beefa5c09d4b5619eea63d6e0dbf97220bb436c5b5347cb1fb051d8"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.007749 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6e79a3c-a8e5-4144-bedb-0e771ee43025","Type":"ContainerStarted","Data":"9100ed182c7420df14d2d289bdd5994d20eda652bc5d3a25e0a4cc34087b78a8"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.010398 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fdbc494b5-smb4m" 
event={"ID":"4976d40b-aca4-4ac9-956c-ff2719378071","Type":"ContainerStarted","Data":"461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.017395 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"253a67a0-8806-43b4-8994-10b9143ee4dd","Type":"ContainerStarted","Data":"66d2ceb1ae812dc201ee0142adad5957461df9a387a193879cb55d1c9b3e7b73"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.017450 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"253a67a0-8806-43b4-8994-10b9143ee4dd","Type":"ContainerStarted","Data":"6fad97d9cdb789758c1f9ac171709f13293b7766fec992786ded0e1f7cdba3d8"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.024023 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-598d6b48f9-h5jq8" event={"ID":"8a022a2b-3eb8-4b7b-a496-5649767568a7","Type":"ContainerStarted","Data":"01c7e151913dc1faaf2c706b3516aa0e455eb28cd829add0ab4eb03c81e0ccec"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.024163 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-598d6b48f9-h5jq8" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon-log" containerID="cri-o://9a15f6f6fdc92aa1fc1b7bb83a6e66b2aa71891dee85dd8553941908a3d17f54" gracePeriod=30 Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.024500 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-598d6b48f9-h5jq8" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon" containerID="cri-o://01c7e151913dc1faaf2c706b3516aa0e455eb28cd829add0ab4eb03c81e0ccec" gracePeriod=30 Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.026613 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d4548cc-cqqrq" 
event={"ID":"a27f67b7-efd3-4634-b422-99b953a1e67c","Type":"ContainerStarted","Data":"3a77051d0aabb5ab375c3bae5a2ebf213d884d7c99f60dee1f039da4923e7bd3"} Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.032690 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fdbc494b5-smb4m" podStartSLOduration=2.737873207 podStartE2EDuration="11.032572285s" podCreationTimestamp="2026-02-17 19:25:15 +0000 UTC" firstStartedPulling="2026-02-17 19:25:16.31654552 +0000 UTC m=+6087.691948785" lastFinishedPulling="2026-02-17 19:25:24.611244598 +0000 UTC m=+6095.986647863" observedRunningTime="2026-02-17 19:25:26.031815044 +0000 UTC m=+6097.407218309" watchObservedRunningTime="2026-02-17 19:25:26.032572285 +0000 UTC m=+6097.407975550" Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.072634 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-598d6b48f9-h5jq8" podStartSLOduration=3.329628278 podStartE2EDuration="12.072617229s" podCreationTimestamp="2026-02-17 19:25:14 +0000 UTC" firstStartedPulling="2026-02-17 19:25:15.793067601 +0000 UTC m=+6087.168470866" lastFinishedPulling="2026-02-17 19:25:24.536056522 +0000 UTC m=+6095.911459817" observedRunningTime="2026-02-17 19:25:26.058443316 +0000 UTC m=+6097.433846581" watchObservedRunningTime="2026-02-17 19:25:26.072617229 +0000 UTC m=+6097.448020494" Feb 17 19:25:26 crc kubenswrapper[4892]: I0217 19:25:26.086291 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-689d4548cc-cqqrq" podStartSLOduration=3.360557597 podStartE2EDuration="12.086266679s" podCreationTimestamp="2026-02-17 19:25:14 +0000 UTC" firstStartedPulling="2026-02-17 19:25:15.827914916 +0000 UTC m=+6087.203318181" lastFinishedPulling="2026-02-17 19:25:24.553623968 +0000 UTC m=+6095.929027263" observedRunningTime="2026-02-17 19:25:26.077493922 +0000 UTC m=+6097.452897177" watchObservedRunningTime="2026-02-17 19:25:26.086266679 +0000 
UTC m=+6097.461669944" Feb 17 19:25:27 crc kubenswrapper[4892]: I0217 19:25:27.040671 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"253a67a0-8806-43b4-8994-10b9143ee4dd","Type":"ContainerStarted","Data":"6877d2cee447ed6fc0f46342fb720d8466a260cadfff70f9036d1ecaa47bff71"} Feb 17 19:25:27 crc kubenswrapper[4892]: I0217 19:25:27.044019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6e79a3c-a8e5-4144-bedb-0e771ee43025","Type":"ContainerStarted","Data":"9298ddb3b86a4e9bc626721e0e7f443e523c7a2dd9c3fd9e49826e9bb7e31bb9"} Feb 17 19:25:27 crc kubenswrapper[4892]: I0217 19:25:27.092411 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.092381479 podStartE2EDuration="9.092381479s" podCreationTimestamp="2026-02-17 19:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:25:27.075368509 +0000 UTC m=+6098.450771804" watchObservedRunningTime="2026-02-17 19:25:27.092381479 +0000 UTC m=+6098.467784774" Feb 17 19:25:27 crc kubenswrapper[4892]: I0217 19:25:27.125935 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.125914988 podStartE2EDuration="9.125914988s" podCreationTimestamp="2026-02-17 19:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:25:27.10757083 +0000 UTC m=+6098.482974145" watchObservedRunningTime="2026-02-17 19:25:27.125914988 +0000 UTC m=+6098.501318263" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.316134 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 19:25:29 
crc kubenswrapper[4892]: I0217 19:25:29.316683 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.336035 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.336093 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.378002 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.379620 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.384060 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 19:25:29 crc kubenswrapper[4892]: I0217 19:25:29.403421 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:30 crc kubenswrapper[4892]: I0217 19:25:30.083896 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:30 crc kubenswrapper[4892]: I0217 19:25:30.083942 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 19:25:30 crc kubenswrapper[4892]: I0217 19:25:30.084129 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:30 crc kubenswrapper[4892]: I0217 19:25:30.084164 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 19:25:33 
crc kubenswrapper[4892]: I0217 19:25:33.216406 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 19:25:33 crc kubenswrapper[4892]: I0217 19:25:33.217000 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 19:25:33 crc kubenswrapper[4892]: I0217 19:25:33.324400 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:33 crc kubenswrapper[4892]: I0217 19:25:33.324669 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 19:25:33 crc kubenswrapper[4892]: I0217 19:25:33.424243 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 19:25:33 crc kubenswrapper[4892]: I0217 19:25:33.441517 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 19:25:34 crc kubenswrapper[4892]: I0217 19:25:34.668763 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:34 crc kubenswrapper[4892]: I0217 19:25:34.669019 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:34 crc kubenswrapper[4892]: I0217 19:25:34.912136 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:35 crc kubenswrapper[4892]: I0217 19:25:35.831062 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:35 crc kubenswrapper[4892]: I0217 19:25:35.831113 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:35 crc kubenswrapper[4892]: I0217 19:25:35.833137 4892 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-5fdbc494b5-smb4m" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.140:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8080: connect: connection refused" Feb 17 19:25:37 crc kubenswrapper[4892]: I0217 19:25:37.049723 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2660-account-create-update-kjtbg"] Feb 17 19:25:37 crc kubenswrapper[4892]: I0217 19:25:37.066350 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-w5ds9"] Feb 17 19:25:37 crc kubenswrapper[4892]: I0217 19:25:37.084082 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2660-account-create-update-kjtbg"] Feb 17 19:25:37 crc kubenswrapper[4892]: I0217 19:25:37.093598 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-w5ds9"] Feb 17 19:25:37 crc kubenswrapper[4892]: I0217 19:25:37.395877 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762ac045-f6cb-4819-81ff-89553338a250" path="/var/lib/kubelet/pods/762ac045-f6cb-4819-81ff-89553338a250/volumes" Feb 17 19:25:37 crc kubenswrapper[4892]: I0217 19:25:37.396755 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba60e1d6-317d-4d32-b534-26e5490eb1fc" path="/var/lib/kubelet/pods/ba60e1d6-317d-4d32-b534-26e5490eb1fc/volumes" Feb 17 19:25:44 crc kubenswrapper[4892]: I0217 19:25:44.034206 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5h2fz"] Feb 17 19:25:44 crc kubenswrapper[4892]: I0217 19:25:44.044423 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5h2fz"] Feb 17 19:25:45 crc kubenswrapper[4892]: I0217 19:25:45.375206 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b188c46-158a-41cb-9f68-945408ac3ed5" 
path="/var/lib/kubelet/pods/7b188c46-158a-41cb-9f68-945408ac3ed5/volumes" Feb 17 19:25:46 crc kubenswrapper[4892]: I0217 19:25:46.431642 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:47 crc kubenswrapper[4892]: I0217 19:25:47.503728 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:48 crc kubenswrapper[4892]: I0217 19:25:48.043966 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:25:49 crc kubenswrapper[4892]: I0217 19:25:49.198119 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fdbc494b5-smb4m" Feb 17 19:25:49 crc kubenswrapper[4892]: I0217 19:25:49.269442 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689d4548cc-cqqrq"] Feb 17 19:25:49 crc kubenswrapper[4892]: I0217 19:25:49.272394 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-689d4548cc-cqqrq" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon-log" containerID="cri-o://5aa89074cc9ec5ffe1244cb742c7b589a0451dadf5d4241635ec96d3d80e6274" gracePeriod=30 Feb 17 19:25:49 crc kubenswrapper[4892]: I0217 19:25:49.273615 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-689d4548cc-cqqrq" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" containerID="cri-o://3a77051d0aabb5ab375c3bae5a2ebf213d884d7c99f60dee1f039da4923e7bd3" gracePeriod=30 Feb 17 19:25:53 crc kubenswrapper[4892]: I0217 19:25:53.418240 4892 generic.go:334] "Generic (PLEG): container finished" podID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerID="3a77051d0aabb5ab375c3bae5a2ebf213d884d7c99f60dee1f039da4923e7bd3" exitCode=0 Feb 17 19:25:53 crc kubenswrapper[4892]: I0217 19:25:53.418378 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d4548cc-cqqrq" event={"ID":"a27f67b7-efd3-4634-b422-99b953a1e67c","Type":"ContainerDied","Data":"3a77051d0aabb5ab375c3bae5a2ebf213d884d7c99f60dee1f039da4923e7bd3"} Feb 17 19:25:54 crc kubenswrapper[4892]: I0217 19:25:54.669400 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689d4548cc-cqqrq" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.138:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.138:8080: connect: connection refused" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.463644 4892 generic.go:334] "Generic (PLEG): container finished" podID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerID="01c7e151913dc1faaf2c706b3516aa0e455eb28cd829add0ab4eb03c81e0ccec" exitCode=137 Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.463989 4892 generic.go:334] "Generic (PLEG): container finished" podID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerID="9a15f6f6fdc92aa1fc1b7bb83a6e66b2aa71891dee85dd8553941908a3d17f54" exitCode=137 Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.464083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-598d6b48f9-h5jq8" event={"ID":"8a022a2b-3eb8-4b7b-a496-5649767568a7","Type":"ContainerDied","Data":"01c7e151913dc1faaf2c706b3516aa0e455eb28cd829add0ab4eb03c81e0ccec"} Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.464123 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-598d6b48f9-h5jq8" event={"ID":"8a022a2b-3eb8-4b7b-a496-5649767568a7","Type":"ContainerDied","Data":"9a15f6f6fdc92aa1fc1b7bb83a6e66b2aa71891dee85dd8553941908a3d17f54"} Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.464154 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-598d6b48f9-h5jq8" 
event={"ID":"8a022a2b-3eb8-4b7b-a496-5649767568a7","Type":"ContainerDied","Data":"c13d33898c08447efeaa9378c208e076e604e9597bd08faabe1301743fdfc485"} Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.464185 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c13d33898c08447efeaa9378c208e076e604e9597bd08faabe1301743fdfc485" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.496489 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.584037 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-scripts\") pod \"8a022a2b-3eb8-4b7b-a496-5649767568a7\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.584312 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a022a2b-3eb8-4b7b-a496-5649767568a7-horizon-secret-key\") pod \"8a022a2b-3eb8-4b7b-a496-5649767568a7\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.584335 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a022a2b-3eb8-4b7b-a496-5649767568a7-logs\") pod \"8a022a2b-3eb8-4b7b-a496-5649767568a7\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.584425 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt2d4\" (UniqueName: \"kubernetes.io/projected/8a022a2b-3eb8-4b7b-a496-5649767568a7-kube-api-access-lt2d4\") pod \"8a022a2b-3eb8-4b7b-a496-5649767568a7\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " Feb 17 19:25:56 crc kubenswrapper[4892]: 
I0217 19:25:56.584441 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-config-data\") pod \"8a022a2b-3eb8-4b7b-a496-5649767568a7\" (UID: \"8a022a2b-3eb8-4b7b-a496-5649767568a7\") " Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.585544 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a022a2b-3eb8-4b7b-a496-5649767568a7-logs" (OuterVolumeSpecName: "logs") pod "8a022a2b-3eb8-4b7b-a496-5649767568a7" (UID: "8a022a2b-3eb8-4b7b-a496-5649767568a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.591950 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a022a2b-3eb8-4b7b-a496-5649767568a7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8a022a2b-3eb8-4b7b-a496-5649767568a7" (UID: "8a022a2b-3eb8-4b7b-a496-5649767568a7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.611039 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a022a2b-3eb8-4b7b-a496-5649767568a7-kube-api-access-lt2d4" (OuterVolumeSpecName: "kube-api-access-lt2d4") pod "8a022a2b-3eb8-4b7b-a496-5649767568a7" (UID: "8a022a2b-3eb8-4b7b-a496-5649767568a7"). InnerVolumeSpecName "kube-api-access-lt2d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.611950 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-scripts" (OuterVolumeSpecName: "scripts") pod "8a022a2b-3eb8-4b7b-a496-5649767568a7" (UID: "8a022a2b-3eb8-4b7b-a496-5649767568a7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.637690 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-config-data" (OuterVolumeSpecName: "config-data") pod "8a022a2b-3eb8-4b7b-a496-5649767568a7" (UID: "8a022a2b-3eb8-4b7b-a496-5649767568a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.687446 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2d4\" (UniqueName: \"kubernetes.io/projected/8a022a2b-3eb8-4b7b-a496-5649767568a7-kube-api-access-lt2d4\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.687506 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.687526 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a022a2b-3eb8-4b7b-a496-5649767568a7-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.687549 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a022a2b-3eb8-4b7b-a496-5649767568a7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:56 crc kubenswrapper[4892]: I0217 19:25:56.687570 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a022a2b-3eb8-4b7b-a496-5649767568a7-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:25:57 crc kubenswrapper[4892]: I0217 19:25:57.474635 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-598d6b48f9-h5jq8" Feb 17 19:25:57 crc kubenswrapper[4892]: I0217 19:25:57.514145 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-598d6b48f9-h5jq8"] Feb 17 19:25:57 crc kubenswrapper[4892]: I0217 19:25:57.525044 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-598d6b48f9-h5jq8"] Feb 17 19:25:59 crc kubenswrapper[4892]: I0217 19:25:59.375395 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" path="/var/lib/kubelet/pods/8a022a2b-3eb8-4b7b-a496-5649767568a7/volumes" Feb 17 19:26:03 crc kubenswrapper[4892]: I0217 19:26:03.254970 4892 scope.go:117] "RemoveContainer" containerID="8f577a819e93460b86b0612097544a6ee471507bd452e9a5a9ad36b7fd3da05a" Feb 17 19:26:03 crc kubenswrapper[4892]: I0217 19:26:03.304759 4892 scope.go:117] "RemoveContainer" containerID="7d4ae2b60b9faa6f0a3a71aa98aa4ab3e6d59e97011d769997322e7e43e6e5a9" Feb 17 19:26:03 crc kubenswrapper[4892]: I0217 19:26:03.341535 4892 scope.go:117] "RemoveContainer" containerID="db4cf0b38f43b1cff85a050ebe406706b8addde71dc13a4b348bc844af64d635" Feb 17 19:26:03 crc kubenswrapper[4892]: I0217 19:26:03.402088 4892 scope.go:117] "RemoveContainer" containerID="6ba4a6e09056831cb826c5ef28434496f8537dd932678e77ebc52a1ccea229f9" Feb 17 19:26:04 crc kubenswrapper[4892]: I0217 19:26:04.669315 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689d4548cc-cqqrq" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.138:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.138:8080: connect: connection refused" Feb 17 19:26:12 crc kubenswrapper[4892]: I0217 19:26:12.084754 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-df55-account-create-update-wvxkj"] Feb 17 19:26:12 crc kubenswrapper[4892]: I0217 19:26:12.098242 4892 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-h6827"] Feb 17 19:26:12 crc kubenswrapper[4892]: I0217 19:26:12.110939 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-df55-account-create-update-wvxkj"] Feb 17 19:26:12 crc kubenswrapper[4892]: I0217 19:26:12.122447 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-h6827"] Feb 17 19:26:13 crc kubenswrapper[4892]: I0217 19:26:13.379390 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4edd1b-14b6-4623-b2cb-5f26e4044fe1" path="/var/lib/kubelet/pods/9b4edd1b-14b6-4623-b2cb-5f26e4044fe1/volumes" Feb 17 19:26:13 crc kubenswrapper[4892]: I0217 19:26:13.381034 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0410f1-c219-4657-b233-febdc2406bf8" path="/var/lib/kubelet/pods/cc0410f1-c219-4657-b233-febdc2406bf8/volumes" Feb 17 19:26:14 crc kubenswrapper[4892]: I0217 19:26:14.669565 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689d4548cc-cqqrq" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.138:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.138:8080: connect: connection refused" Feb 17 19:26:14 crc kubenswrapper[4892]: I0217 19:26:14.670177 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.781648 4892 generic.go:334] "Generic (PLEG): container finished" podID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerID="5aa89074cc9ec5ffe1244cb742c7b589a0451dadf5d4241635ec96d3d80e6274" exitCode=137 Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.781715 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d4548cc-cqqrq" 
event={"ID":"a27f67b7-efd3-4634-b422-99b953a1e67c","Type":"ContainerDied","Data":"5aa89074cc9ec5ffe1244cb742c7b589a0451dadf5d4241635ec96d3d80e6274"} Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.782191 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d4548cc-cqqrq" event={"ID":"a27f67b7-efd3-4634-b422-99b953a1e67c","Type":"ContainerDied","Data":"873b628e052f77a802e45c68fe101fe09ef88b33df6df204b8ecd4e8b8c22ce2"} Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.782205 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873b628e052f77a802e45c68fe101fe09ef88b33df6df204b8ecd4e8b8c22ce2" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.791257 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.936851 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a27f67b7-efd3-4634-b422-99b953a1e67c-horizon-secret-key\") pod \"a27f67b7-efd3-4634-b422-99b953a1e67c\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.936922 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27f67b7-efd3-4634-b422-99b953a1e67c-logs\") pod \"a27f67b7-efd3-4634-b422-99b953a1e67c\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.937323 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27f67b7-efd3-4634-b422-99b953a1e67c-logs" (OuterVolumeSpecName: "logs") pod "a27f67b7-efd3-4634-b422-99b953a1e67c" (UID: "a27f67b7-efd3-4634-b422-99b953a1e67c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.937360 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-config-data\") pod \"a27f67b7-efd3-4634-b422-99b953a1e67c\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.937438 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-scripts\") pod \"a27f67b7-efd3-4634-b422-99b953a1e67c\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.937750 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkgcf\" (UniqueName: \"kubernetes.io/projected/a27f67b7-efd3-4634-b422-99b953a1e67c-kube-api-access-kkgcf\") pod \"a27f67b7-efd3-4634-b422-99b953a1e67c\" (UID: \"a27f67b7-efd3-4634-b422-99b953a1e67c\") " Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.938385 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27f67b7-efd3-4634-b422-99b953a1e67c-logs\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.942408 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27f67b7-efd3-4634-b422-99b953a1e67c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a27f67b7-efd3-4634-b422-99b953a1e67c" (UID: "a27f67b7-efd3-4634-b422-99b953a1e67c"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.943470 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27f67b7-efd3-4634-b422-99b953a1e67c-kube-api-access-kkgcf" (OuterVolumeSpecName: "kube-api-access-kkgcf") pod "a27f67b7-efd3-4634-b422-99b953a1e67c" (UID: "a27f67b7-efd3-4634-b422-99b953a1e67c"). InnerVolumeSpecName "kube-api-access-kkgcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.963635 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-config-data" (OuterVolumeSpecName: "config-data") pod "a27f67b7-efd3-4634-b422-99b953a1e67c" (UID: "a27f67b7-efd3-4634-b422-99b953a1e67c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:26:19 crc kubenswrapper[4892]: I0217 19:26:19.984090 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-scripts" (OuterVolumeSpecName: "scripts") pod "a27f67b7-efd3-4634-b422-99b953a1e67c" (UID: "a27f67b7-efd3-4634-b422-99b953a1e67c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.040071 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a27f67b7-efd3-4634-b422-99b953a1e67c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.040105 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.040115 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a27f67b7-efd3-4634-b422-99b953a1e67c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.040124 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkgcf\" (UniqueName: \"kubernetes.io/projected/a27f67b7-efd3-4634-b422-99b953a1e67c-kube-api-access-kkgcf\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.794756 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-689d4548cc-cqqrq" Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.852880 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689d4548cc-cqqrq"] Feb 17 19:26:20 crc kubenswrapper[4892]: I0217 19:26:20.867508 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-689d4548cc-cqqrq"] Feb 17 19:26:21 crc kubenswrapper[4892]: I0217 19:26:21.387539 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" path="/var/lib/kubelet/pods/a27f67b7-efd3-4634-b422-99b953a1e67c/volumes" Feb 17 19:26:22 crc kubenswrapper[4892]: I0217 19:26:22.046783 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kjvr5"] Feb 17 19:26:22 crc kubenswrapper[4892]: I0217 19:26:22.061958 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kjvr5"] Feb 17 19:26:23 crc kubenswrapper[4892]: I0217 19:26:23.377000 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587b7485-a715-4355-8424-08cdb036121d" path="/var/lib/kubelet/pods/587b7485-a715-4355-8424-08cdb036121d/volumes" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.648374 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dcbd57fbf-8j2s5"] Feb 17 19:26:31 crc kubenswrapper[4892]: E0217 19:26:31.649194 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649206 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" Feb 17 19:26:31 crc kubenswrapper[4892]: E0217 19:26:31.649227 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649233 
4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon" Feb 17 19:26:31 crc kubenswrapper[4892]: E0217 19:26:31.649249 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon-log" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649255 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon-log" Feb 17 19:26:31 crc kubenswrapper[4892]: E0217 19:26:31.649288 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon-log" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649293 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon-log" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649487 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649502 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon-log" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649521 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27f67b7-efd3-4634-b422-99b953a1e67c" containerName="horizon-log" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.649529 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a022a2b-3eb8-4b7b-a496-5649767568a7" containerName="horizon" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.650605 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.668978 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dcbd57fbf-8j2s5"] Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.724531 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-logs\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.724607 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbn68\" (UniqueName: \"kubernetes.io/projected/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-kube-api-access-pbn68\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.724764 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-horizon-secret-key\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.724950 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-config-data\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.725266 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-scripts\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.827904 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-scripts\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.828336 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-logs\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.828474 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbn68\" (UniqueName: \"kubernetes.io/projected/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-kube-api-access-pbn68\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.828539 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-horizon-secret-key\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.828651 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-config-data\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: 
\"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.828713 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-scripts\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.829036 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-logs\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.829807 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-config-data\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.834677 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-horizon-secret-key\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.849171 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbn68\" (UniqueName: \"kubernetes.io/projected/fe6fd6f5-5715-4a79-8c5b-839baf178e1d-kube-api-access-pbn68\") pod \"horizon-dcbd57fbf-8j2s5\" (UID: \"fe6fd6f5-5715-4a79-8c5b-839baf178e1d\") " pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:31 crc kubenswrapper[4892]: I0217 19:26:31.986144 4892 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.580590 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dcbd57fbf-8j2s5"] Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.859056 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-4h7h4"] Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.860948 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.892141 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-4h7h4"] Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.940713 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-d2c7-account-create-update-fkq2k"] Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.942264 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.944139 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.950719 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19fe62af-2689-46c8-837f-6a6af7f92052-operator-scripts\") pod \"heat-db-create-4h7h4\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.950935 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn4lg\" (UniqueName: \"kubernetes.io/projected/19fe62af-2689-46c8-837f-6a6af7f92052-kube-api-access-jn4lg\") pod \"heat-db-create-4h7h4\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.972708 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d2c7-account-create-update-fkq2k"] Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.983226 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dcbd57fbf-8j2s5" event={"ID":"fe6fd6f5-5715-4a79-8c5b-839baf178e1d","Type":"ContainerStarted","Data":"61e9f6fda3f53bb31f5160abfa02474caabf78d467723796f59798ea85bfb274"} Feb 17 19:26:32 crc kubenswrapper[4892]: I0217 19:26:32.983267 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dcbd57fbf-8j2s5" event={"ID":"fe6fd6f5-5715-4a79-8c5b-839baf178e1d","Type":"ContainerStarted","Data":"770fd5191282b3c49d7711d0e066fb6df4399d8e59f52d8a6ee0053e8e64dc9c"} Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.052972 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6kh64\" (UniqueName: \"kubernetes.io/projected/9ead331a-f404-4cd9-9b84-00f5804cf185-kube-api-access-6kh64\") pod \"heat-d2c7-account-create-update-fkq2k\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.053024 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19fe62af-2689-46c8-837f-6a6af7f92052-operator-scripts\") pod \"heat-db-create-4h7h4\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.053080 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ead331a-f404-4cd9-9b84-00f5804cf185-operator-scripts\") pod \"heat-d2c7-account-create-update-fkq2k\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.053131 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn4lg\" (UniqueName: \"kubernetes.io/projected/19fe62af-2689-46c8-837f-6a6af7f92052-kube-api-access-jn4lg\") pod \"heat-db-create-4h7h4\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.054071 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19fe62af-2689-46c8-837f-6a6af7f92052-operator-scripts\") pod \"heat-db-create-4h7h4\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.068584 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jn4lg\" (UniqueName: \"kubernetes.io/projected/19fe62af-2689-46c8-837f-6a6af7f92052-kube-api-access-jn4lg\") pod \"heat-db-create-4h7h4\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.154704 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kh64\" (UniqueName: \"kubernetes.io/projected/9ead331a-f404-4cd9-9b84-00f5804cf185-kube-api-access-6kh64\") pod \"heat-d2c7-account-create-update-fkq2k\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.154781 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ead331a-f404-4cd9-9b84-00f5804cf185-operator-scripts\") pod \"heat-d2c7-account-create-update-fkq2k\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.155514 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ead331a-f404-4cd9-9b84-00f5804cf185-operator-scripts\") pod \"heat-d2c7-account-create-update-fkq2k\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.175781 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kh64\" (UniqueName: \"kubernetes.io/projected/9ead331a-f404-4cd9-9b84-00f5804cf185-kube-api-access-6kh64\") pod \"heat-d2c7-account-create-update-fkq2k\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.246414 4892 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.275304 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.794644 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d2c7-account-create-update-fkq2k"] Feb 17 19:26:33 crc kubenswrapper[4892]: W0217 19:26:33.805052 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ead331a_f404_4cd9_9b84_00f5804cf185.slice/crio-4e39e370c5a30d535f90c354a33623d96e8f14023abc4b41755c88ade2cdf14d WatchSource:0}: Error finding container 4e39e370c5a30d535f90c354a33623d96e8f14023abc4b41755c88ade2cdf14d: Status 404 returned error can't find the container with id 4e39e370c5a30d535f90c354a33623d96e8f14023abc4b41755c88ade2cdf14d Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.806032 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-4h7h4"] Feb 17 19:26:33 crc kubenswrapper[4892]: W0217 19:26:33.807426 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fe62af_2689_46c8_837f_6a6af7f92052.slice/crio-78427e24b0c4a0a958b521b7339bc22c454fd25733c5b50f2ddcfe47b502ad63 WatchSource:0}: Error finding container 78427e24b0c4a0a958b521b7339bc22c454fd25733c5b50f2ddcfe47b502ad63: Status 404 returned error can't find the container with id 78427e24b0c4a0a958b521b7339bc22c454fd25733c5b50f2ddcfe47b502ad63 Feb 17 19:26:33 crc kubenswrapper[4892]: I0217 19:26:33.999156 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4h7h4" event={"ID":"19fe62af-2689-46c8-837f-6a6af7f92052","Type":"ContainerStarted","Data":"78427e24b0c4a0a958b521b7339bc22c454fd25733c5b50f2ddcfe47b502ad63"} Feb 17 19:26:34 crc 
kubenswrapper[4892]: I0217 19:26:34.001752 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dcbd57fbf-8j2s5" event={"ID":"fe6fd6f5-5715-4a79-8c5b-839baf178e1d","Type":"ContainerStarted","Data":"f930593c727d039d38f18b5ba62917062cee86cd9744613d50709fa89097c5df"} Feb 17 19:26:34 crc kubenswrapper[4892]: I0217 19:26:34.003707 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d2c7-account-create-update-fkq2k" event={"ID":"9ead331a-f404-4cd9-9b84-00f5804cf185","Type":"ContainerStarted","Data":"4e39e370c5a30d535f90c354a33623d96e8f14023abc4b41755c88ade2cdf14d"} Feb 17 19:26:34 crc kubenswrapper[4892]: I0217 19:26:34.024096 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dcbd57fbf-8j2s5" podStartSLOduration=3.024047748 podStartE2EDuration="3.024047748s" podCreationTimestamp="2026-02-17 19:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:26:34.021844298 +0000 UTC m=+6165.397247563" watchObservedRunningTime="2026-02-17 19:26:34.024047748 +0000 UTC m=+6165.399451013" Feb 17 19:26:35 crc kubenswrapper[4892]: I0217 19:26:35.019071 4892 generic.go:334] "Generic (PLEG): container finished" podID="9ead331a-f404-4cd9-9b84-00f5804cf185" containerID="bc71ffe5efcce70bf0ea42a08027ff9d77d96e1e1586ea6be9677ca6d4614b42" exitCode=0 Feb 17 19:26:35 crc kubenswrapper[4892]: I0217 19:26:35.019214 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d2c7-account-create-update-fkq2k" event={"ID":"9ead331a-f404-4cd9-9b84-00f5804cf185","Type":"ContainerDied","Data":"bc71ffe5efcce70bf0ea42a08027ff9d77d96e1e1586ea6be9677ca6d4614b42"} Feb 17 19:26:35 crc kubenswrapper[4892]: I0217 19:26:35.032936 4892 generic.go:334] "Generic (PLEG): container finished" podID="19fe62af-2689-46c8-837f-6a6af7f92052" 
containerID="0f596e43bd6920656122497cc29e605b2ff65ab29a1a2307c12b16f7161b7eb9" exitCode=0 Feb 17 19:26:35 crc kubenswrapper[4892]: I0217 19:26:35.034231 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4h7h4" event={"ID":"19fe62af-2689-46c8-837f-6a6af7f92052","Type":"ContainerDied","Data":"0f596e43bd6920656122497cc29e605b2ff65ab29a1a2307c12b16f7161b7eb9"} Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.375113 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.451484 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn4lg\" (UniqueName: \"kubernetes.io/projected/19fe62af-2689-46c8-837f-6a6af7f92052-kube-api-access-jn4lg\") pod \"19fe62af-2689-46c8-837f-6a6af7f92052\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.451518 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19fe62af-2689-46c8-837f-6a6af7f92052-operator-scripts\") pod \"19fe62af-2689-46c8-837f-6a6af7f92052\" (UID: \"19fe62af-2689-46c8-837f-6a6af7f92052\") " Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.452765 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fe62af-2689-46c8-837f-6a6af7f92052-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19fe62af-2689-46c8-837f-6a6af7f92052" (UID: "19fe62af-2689-46c8-837f-6a6af7f92052"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.453017 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19fe62af-2689-46c8-837f-6a6af7f92052-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.458228 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fe62af-2689-46c8-837f-6a6af7f92052-kube-api-access-jn4lg" (OuterVolumeSpecName: "kube-api-access-jn4lg") pod "19fe62af-2689-46c8-837f-6a6af7f92052" (UID: "19fe62af-2689-46c8-837f-6a6af7f92052"). InnerVolumeSpecName "kube-api-access-jn4lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.551282 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.555208 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn4lg\" (UniqueName: \"kubernetes.io/projected/19fe62af-2689-46c8-837f-6a6af7f92052-kube-api-access-jn4lg\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.656402 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kh64\" (UniqueName: \"kubernetes.io/projected/9ead331a-f404-4cd9-9b84-00f5804cf185-kube-api-access-6kh64\") pod \"9ead331a-f404-4cd9-9b84-00f5804cf185\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.656825 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ead331a-f404-4cd9-9b84-00f5804cf185-operator-scripts\") pod \"9ead331a-f404-4cd9-9b84-00f5804cf185\" (UID: \"9ead331a-f404-4cd9-9b84-00f5804cf185\") " 
Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.657555 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ead331a-f404-4cd9-9b84-00f5804cf185-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ead331a-f404-4cd9-9b84-00f5804cf185" (UID: "9ead331a-f404-4cd9-9b84-00f5804cf185"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.659184 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ead331a-f404-4cd9-9b84-00f5804cf185-kube-api-access-6kh64" (OuterVolumeSpecName: "kube-api-access-6kh64") pod "9ead331a-f404-4cd9-9b84-00f5804cf185" (UID: "9ead331a-f404-4cd9-9b84-00f5804cf185"). InnerVolumeSpecName "kube-api-access-6kh64". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.758425 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kh64\" (UniqueName: \"kubernetes.io/projected/9ead331a-f404-4cd9-9b84-00f5804cf185-kube-api-access-6kh64\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:36 crc kubenswrapper[4892]: I0217 19:26:36.758638 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ead331a-f404-4cd9-9b84-00f5804cf185-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:37 crc kubenswrapper[4892]: I0217 19:26:37.065425 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d2c7-account-create-update-fkq2k" Feb 17 19:26:37 crc kubenswrapper[4892]: I0217 19:26:37.065424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d2c7-account-create-update-fkq2k" event={"ID":"9ead331a-f404-4cd9-9b84-00f5804cf185","Type":"ContainerDied","Data":"4e39e370c5a30d535f90c354a33623d96e8f14023abc4b41755c88ade2cdf14d"} Feb 17 19:26:37 crc kubenswrapper[4892]: I0217 19:26:37.065639 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e39e370c5a30d535f90c354a33623d96e8f14023abc4b41755c88ade2cdf14d" Feb 17 19:26:37 crc kubenswrapper[4892]: I0217 19:26:37.067755 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4h7h4" event={"ID":"19fe62af-2689-46c8-837f-6a6af7f92052","Type":"ContainerDied","Data":"78427e24b0c4a0a958b521b7339bc22c454fd25733c5b50f2ddcfe47b502ad63"} Feb 17 19:26:37 crc kubenswrapper[4892]: I0217 19:26:37.067791 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78427e24b0c4a0a958b521b7339bc22c454fd25733c5b50f2ddcfe47b502ad63" Feb 17 19:26:37 crc kubenswrapper[4892]: I0217 19:26:37.067839 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-4h7h4" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.065214 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-zwvbt"] Feb 17 19:26:38 crc kubenswrapper[4892]: E0217 19:26:38.066691 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fe62af-2689-46c8-837f-6a6af7f92052" containerName="mariadb-database-create" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.066767 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fe62af-2689-46c8-837f-6a6af7f92052" containerName="mariadb-database-create" Feb 17 19:26:38 crc kubenswrapper[4892]: E0217 19:26:38.066843 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ead331a-f404-4cd9-9b84-00f5804cf185" containerName="mariadb-account-create-update" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.066892 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ead331a-f404-4cd9-9b84-00f5804cf185" containerName="mariadb-account-create-update" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.067159 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ead331a-f404-4cd9-9b84-00f5804cf185" containerName="mariadb-account-create-update" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.067223 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fe62af-2689-46c8-837f-6a6af7f92052" containerName="mariadb-database-create" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.067963 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.071700 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-wlmh5" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.072001 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.108718 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zwvbt"] Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.205669 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-config-data\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.205810 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgk6\" (UniqueName: \"kubernetes.io/projected/61ec9d13-098a-4249-8f59-db24d25a7ff9-kube-api-access-7mgk6\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.205885 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-combined-ca-bundle\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.307909 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-config-data\") pod \"heat-db-sync-zwvbt\" (UID: 
\"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.308023 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgk6\" (UniqueName: \"kubernetes.io/projected/61ec9d13-098a-4249-8f59-db24d25a7ff9-kube-api-access-7mgk6\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.308068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-combined-ca-bundle\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.314911 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-config-data\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.316312 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-combined-ca-bundle\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.324034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgk6\" (UniqueName: \"kubernetes.io/projected/61ec9d13-098a-4249-8f59-db24d25a7ff9-kube-api-access-7mgk6\") pod \"heat-db-sync-zwvbt\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.390731 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.869686 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zwvbt"] Feb 17 19:26:38 crc kubenswrapper[4892]: W0217 19:26:38.878192 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61ec9d13_098a_4249_8f59_db24d25a7ff9.slice/crio-4becf0f356babc199da99f49e9f2aa46829717462d57c999fbd0a149c9044d91 WatchSource:0}: Error finding container 4becf0f356babc199da99f49e9f2aa46829717462d57c999fbd0a149c9044d91: Status 404 returned error can't find the container with id 4becf0f356babc199da99f49e9f2aa46829717462d57c999fbd0a149c9044d91 Feb 17 19:26:38 crc kubenswrapper[4892]: I0217 19:26:38.881102 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 19:26:39 crc kubenswrapper[4892]: I0217 19:26:39.088359 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zwvbt" event={"ID":"61ec9d13-098a-4249-8f59-db24d25a7ff9","Type":"ContainerStarted","Data":"4becf0f356babc199da99f49e9f2aa46829717462d57c999fbd0a149c9044d91"} Feb 17 19:26:41 crc kubenswrapper[4892]: I0217 19:26:41.987247 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:41 crc kubenswrapper[4892]: I0217 19:26:41.987529 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:47 crc kubenswrapper[4892]: I0217 19:26:47.204081 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zwvbt" event={"ID":"61ec9d13-098a-4249-8f59-db24d25a7ff9","Type":"ContainerStarted","Data":"17c5421290a2046338b02c97de1fcba92ead909ff6b4e9d121e0ce66c1e73915"} Feb 17 19:26:47 crc kubenswrapper[4892]: I0217 19:26:47.230740 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-zwvbt" podStartSLOduration=1.5264004180000001 podStartE2EDuration="9.230713516s" podCreationTimestamp="2026-02-17 19:26:38 +0000 UTC" firstStartedPulling="2026-02-17 19:26:38.880839503 +0000 UTC m=+6170.256242758" lastFinishedPulling="2026-02-17 19:26:46.585152591 +0000 UTC m=+6177.960555856" observedRunningTime="2026-02-17 19:26:47.227281252 +0000 UTC m=+6178.602684547" watchObservedRunningTime="2026-02-17 19:26:47.230713516 +0000 UTC m=+6178.606116791" Feb 17 19:26:49 crc kubenswrapper[4892]: I0217 19:26:49.229181 4892 generic.go:334] "Generic (PLEG): container finished" podID="61ec9d13-098a-4249-8f59-db24d25a7ff9" containerID="17c5421290a2046338b02c97de1fcba92ead909ff6b4e9d121e0ce66c1e73915" exitCode=0 Feb 17 19:26:49 crc kubenswrapper[4892]: I0217 19:26:49.229243 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zwvbt" event={"ID":"61ec9d13-098a-4249-8f59-db24d25a7ff9","Type":"ContainerDied","Data":"17c5421290a2046338b02c97de1fcba92ead909ff6b4e9d121e0ce66c1e73915"} Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.727805 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.849748 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-config-data\") pod \"61ec9d13-098a-4249-8f59-db24d25a7ff9\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.849981 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mgk6\" (UniqueName: \"kubernetes.io/projected/61ec9d13-098a-4249-8f59-db24d25a7ff9-kube-api-access-7mgk6\") pod \"61ec9d13-098a-4249-8f59-db24d25a7ff9\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.850044 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-combined-ca-bundle\") pod \"61ec9d13-098a-4249-8f59-db24d25a7ff9\" (UID: \"61ec9d13-098a-4249-8f59-db24d25a7ff9\") " Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.874488 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ec9d13-098a-4249-8f59-db24d25a7ff9-kube-api-access-7mgk6" (OuterVolumeSpecName: "kube-api-access-7mgk6") pod "61ec9d13-098a-4249-8f59-db24d25a7ff9" (UID: "61ec9d13-098a-4249-8f59-db24d25a7ff9"). InnerVolumeSpecName "kube-api-access-7mgk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.905283 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61ec9d13-098a-4249-8f59-db24d25a7ff9" (UID: "61ec9d13-098a-4249-8f59-db24d25a7ff9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.939491 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-config-data" (OuterVolumeSpecName: "config-data") pod "61ec9d13-098a-4249-8f59-db24d25a7ff9" (UID: "61ec9d13-098a-4249-8f59-db24d25a7ff9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.953569 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.953601 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ec9d13-098a-4249-8f59-db24d25a7ff9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:50 crc kubenswrapper[4892]: I0217 19:26:50.953611 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mgk6\" (UniqueName: \"kubernetes.io/projected/61ec9d13-098a-4249-8f59-db24d25a7ff9-kube-api-access-7mgk6\") on node \"crc\" DevicePath \"\"" Feb 17 19:26:51 crc kubenswrapper[4892]: I0217 19:26:51.259456 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zwvbt" event={"ID":"61ec9d13-098a-4249-8f59-db24d25a7ff9","Type":"ContainerDied","Data":"4becf0f356babc199da99f49e9f2aa46829717462d57c999fbd0a149c9044d91"} Feb 17 19:26:51 crc kubenswrapper[4892]: I0217 19:26:51.259711 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4becf0f356babc199da99f49e9f2aa46829717462d57c999fbd0a149c9044d91" Feb 17 19:26:51 crc kubenswrapper[4892]: I0217 19:26:51.259770 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zwvbt" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.677187 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-768dfc78cc-qbdts"] Feb 17 19:26:52 crc kubenswrapper[4892]: E0217 19:26:52.677907 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ec9d13-098a-4249-8f59-db24d25a7ff9" containerName="heat-db-sync" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.677919 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ec9d13-098a-4249-8f59-db24d25a7ff9" containerName="heat-db-sync" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.678180 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ec9d13-098a-4249-8f59-db24d25a7ff9" containerName="heat-db-sync" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.678934 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.684951 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.685937 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-wlmh5" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.686186 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.713169 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-768dfc78cc-qbdts"] Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.801423 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-combined-ca-bundle\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: 
\"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.801564 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfgv\" (UniqueName: \"kubernetes.io/projected/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-kube-api-access-pbfgv\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.801595 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-config-data-custom\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.801660 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-config-data\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.819055 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5fcbcbc588-hh84w"] Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.820651 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.824188 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.850881 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7bf8574b5-6xj5n"] Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.852378 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.862295 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.869917 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fcbcbc588-hh84w"] Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.887420 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7bf8574b5-6xj5n"] Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.906673 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7vr\" (UniqueName: \"kubernetes.io/projected/0fc379d3-42fb-4714-b739-78a9f7e81068-kube-api-access-zl7vr\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.906772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-combined-ca-bundle\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.906843 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-config-data\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.906862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-combined-ca-bundle\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.908209 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-config-data\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.908290 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-combined-ca-bundle\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.908418 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfgv\" (UniqueName: \"kubernetes.io/projected/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-kube-api-access-pbfgv\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.908449 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-config-data-custom\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.908784 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-config-data-custom\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.908882 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-config-data-custom\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.909426 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-config-data\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.909846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpv5c\" (UniqueName: \"kubernetes.io/projected/92b20a09-f679-471f-aad0-bf6e308b3bce-kube-api-access-mpv5c\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 
19:26:52.914666 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-config-data-custom\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.915103 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-combined-ca-bundle\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.925252 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-config-data\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:52 crc kubenswrapper[4892]: I0217 19:26:52.931244 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfgv\" (UniqueName: \"kubernetes.io/projected/2d5b0472-6954-47d0-b1dd-aeefea0ce5be-kube-api-access-pbfgv\") pod \"heat-engine-768dfc78cc-qbdts\" (UID: \"2d5b0472-6954-47d0-b1dd-aeefea0ce5be\") " pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012165 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-config-data-custom\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012562 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-config-data-custom\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012698 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpv5c\" (UniqueName: \"kubernetes.io/projected/92b20a09-f679-471f-aad0-bf6e308b3bce-kube-api-access-mpv5c\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012748 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7vr\" (UniqueName: \"kubernetes.io/projected/0fc379d3-42fb-4714-b739-78a9f7e81068-kube-api-access-zl7vr\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012794 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-config-data\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012924 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-combined-ca-bundle\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.012956 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-config-data\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.013013 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-combined-ca-bundle\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.017954 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-combined-ca-bundle\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.018831 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-combined-ca-bundle\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.019233 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-config-data\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.019745 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/92b20a09-f679-471f-aad0-bf6e308b3bce-config-data-custom\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.023178 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.027122 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-config-data-custom\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.027602 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc379d3-42fb-4714-b739-78a9f7e81068-config-data\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.030352 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7vr\" (UniqueName: \"kubernetes.io/projected/0fc379d3-42fb-4714-b739-78a9f7e81068-kube-api-access-zl7vr\") pod \"heat-cfnapi-5fcbcbc588-hh84w\" (UID: \"0fc379d3-42fb-4714-b739-78a9f7e81068\") " pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.035839 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpv5c\" (UniqueName: \"kubernetes.io/projected/92b20a09-f679-471f-aad0-bf6e308b3bce-kube-api-access-mpv5c\") pod \"heat-api-7bf8574b5-6xj5n\" (UID: \"92b20a09-f679-471f-aad0-bf6e308b3bce\") " pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.151970 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.190103 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.695426 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-768dfc78cc-qbdts"] Feb 17 19:26:53 crc kubenswrapper[4892]: W0217 19:26:53.697418 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d5b0472_6954_47d0_b1dd_aeefea0ce5be.slice/crio-fb79e58dfcf2f1b38358c83c6dc99abc098fdbae3c976b221f1a45ef38e0b4b3 WatchSource:0}: Error finding container fb79e58dfcf2f1b38358c83c6dc99abc098fdbae3c976b221f1a45ef38e0b4b3: Status 404 returned error can't find the container with id fb79e58dfcf2f1b38358c83c6dc99abc098fdbae3c976b221f1a45ef38e0b4b3 Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.815328 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fcbcbc588-hh84w"] Feb 17 19:26:53 crc kubenswrapper[4892]: I0217 19:26:53.926125 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7bf8574b5-6xj5n"] Feb 17 19:26:53 crc kubenswrapper[4892]: W0217 19:26:53.930718 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92b20a09_f679_471f_aad0_bf6e308b3bce.slice/crio-5cc3e673f3679699ff721acbbce64bd71d80d675f8e0341f0d4d0992b4fcc247 WatchSource:0}: Error finding container 5cc3e673f3679699ff721acbbce64bd71d80d675f8e0341f0d4d0992b4fcc247: Status 404 returned error can't find the container with id 5cc3e673f3679699ff721acbbce64bd71d80d675f8e0341f0d4d0992b4fcc247 Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.020190 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.315598 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf8574b5-6xj5n" event={"ID":"92b20a09-f679-471f-aad0-bf6e308b3bce","Type":"ContainerStarted","Data":"5cc3e673f3679699ff721acbbce64bd71d80d675f8e0341f0d4d0992b4fcc247"} Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.325488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" event={"ID":"0fc379d3-42fb-4714-b739-78a9f7e81068","Type":"ContainerStarted","Data":"c95f0a73d23680d697008b647981a56062d6dd61b189cf67ed35223c99afadf1"} Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.326698 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-768dfc78cc-qbdts" event={"ID":"2d5b0472-6954-47d0-b1dd-aeefea0ce5be","Type":"ContainerStarted","Data":"1fffb6222bb025b8eb15026a02ec146ef7ab2fdffc27050a4d9deb36e4151ab3"} Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.326739 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-768dfc78cc-qbdts" event={"ID":"2d5b0472-6954-47d0-b1dd-aeefea0ce5be","Type":"ContainerStarted","Data":"fb79e58dfcf2f1b38358c83c6dc99abc098fdbae3c976b221f1a45ef38e0b4b3"} Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.326869 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:26:54 crc kubenswrapper[4892]: I0217 19:26:54.361067 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-768dfc78cc-qbdts" podStartSLOduration=2.361052108 podStartE2EDuration="2.361052108s" podCreationTimestamp="2026-02-17 19:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:26:54.347376658 +0000 UTC m=+6185.722779923" 
watchObservedRunningTime="2026-02-17 19:26:54.361052108 +0000 UTC m=+6185.736455373" Feb 17 19:26:56 crc kubenswrapper[4892]: I0217 19:26:56.087773 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-dcbd57fbf-8j2s5" Feb 17 19:26:56 crc kubenswrapper[4892]: I0217 19:26:56.157448 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fdbc494b5-smb4m"] Feb 17 19:26:56 crc kubenswrapper[4892]: I0217 19:26:56.157671 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fdbc494b5-smb4m" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon-log" containerID="cri-o://e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5" gracePeriod=30 Feb 17 19:26:56 crc kubenswrapper[4892]: I0217 19:26:56.158159 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fdbc494b5-smb4m" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon" containerID="cri-o://461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c" gracePeriod=30 Feb 17 19:26:57 crc kubenswrapper[4892]: I0217 19:26:57.382120 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:26:57 crc kubenswrapper[4892]: I0217 19:26:57.382723 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf8574b5-6xj5n" event={"ID":"92b20a09-f679-471f-aad0-bf6e308b3bce","Type":"ContainerStarted","Data":"43c6059d2441ce0e57fe535bfb01ec3a4a4bc8de62385316ff73150cdb757b53"} Feb 17 19:26:57 crc kubenswrapper[4892]: I0217 19:26:57.382746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" event={"ID":"0fc379d3-42fb-4714-b739-78a9f7e81068","Type":"ContainerStarted","Data":"c852ab549131db3d8ee8219cd05638afea415279a2de48129a917580a4cffa9a"} Feb 17 19:26:57 crc kubenswrapper[4892]: I0217 19:26:57.385907 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:26:57 crc kubenswrapper[4892]: I0217 19:26:57.406077 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7bf8574b5-6xj5n" podStartSLOduration=2.854162004 podStartE2EDuration="5.406058961s" podCreationTimestamp="2026-02-17 19:26:52 +0000 UTC" firstStartedPulling="2026-02-17 19:26:53.932865401 +0000 UTC m=+6185.308268666" lastFinishedPulling="2026-02-17 19:26:56.484762358 +0000 UTC m=+6187.860165623" observedRunningTime="2026-02-17 19:26:57.387468577 +0000 UTC m=+6188.762871852" watchObservedRunningTime="2026-02-17 19:26:57.406058961 +0000 UTC m=+6188.781462236" Feb 17 19:26:57 crc kubenswrapper[4892]: I0217 19:26:57.411825 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" podStartSLOduration=2.760381445 podStartE2EDuration="5.411796477s" podCreationTimestamp="2026-02-17 19:26:52 +0000 UTC" firstStartedPulling="2026-02-17 19:26:53.830959781 +0000 UTC m=+6185.206363046" lastFinishedPulling="2026-02-17 19:26:56.482374813 +0000 UTC m=+6187.857778078" observedRunningTime="2026-02-17 19:26:57.405684641 +0000 UTC m=+6188.781087916" watchObservedRunningTime="2026-02-17 19:26:57.411796477 +0000 UTC m=+6188.787199742" Feb 17 19:26:59 crc kubenswrapper[4892]: I0217 19:26:59.411224 4892 generic.go:334] "Generic (PLEG): container finished" podID="4976d40b-aca4-4ac9-956c-ff2719378071" containerID="461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c" exitCode=0 Feb 17 19:26:59 crc kubenswrapper[4892]: I0217 19:26:59.411447 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fdbc494b5-smb4m" event={"ID":"4976d40b-aca4-4ac9-956c-ff2719378071","Type":"ContainerDied","Data":"461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c"} Feb 17 19:27:03 crc kubenswrapper[4892]: I0217 19:27:03.071897 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-768dfc78cc-qbdts" Feb 17 19:27:03 crc kubenswrapper[4892]: I0217 19:27:03.618099 4892 scope.go:117] "RemoveContainer" containerID="d8fa6b77b197156ab5b88cc3d6d45054d4896bf5a83b3b2c48fd3b4478b170ad" Feb 17 19:27:03 crc kubenswrapper[4892]: I0217 19:27:03.749140 4892 scope.go:117] "RemoveContainer" containerID="eb207e4acf5536ef95c537fb8f7cc4631c7f08e8223e6ba0d1a0fbaad9708aec" Feb 17 19:27:03 crc kubenswrapper[4892]: I0217 19:27:03.797032 4892 scope.go:117] "RemoveContainer" containerID="512bd8c4a3b30d9dad3257749b2223715980d679bf3a038de491ed77dbd33e78" Feb 17 19:27:04 crc kubenswrapper[4892]: I0217 19:27:04.054004 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f3d8-account-create-update-wnrsx"] Feb 17 19:27:04 crc kubenswrapper[4892]: I0217 19:27:04.069233 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cpcdn"] Feb 17 19:27:04 crc kubenswrapper[4892]: I0217 19:27:04.082897 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f3d8-account-create-update-wnrsx"] Feb 17 19:27:04 crc kubenswrapper[4892]: I0217 19:27:04.095359 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cpcdn"] Feb 17 19:27:04 crc kubenswrapper[4892]: I0217 19:27:04.536707 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5fcbcbc588-hh84w" Feb 17 19:27:04 crc kubenswrapper[4892]: I0217 19:27:04.646097 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7bf8574b5-6xj5n" Feb 17 19:27:05 crc kubenswrapper[4892]: I0217 19:27:05.371434 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1719978b-bac0-4de9-a1d0-0699cc81c53b" path="/var/lib/kubelet/pods/1719978b-bac0-4de9-a1d0-0699cc81c53b/volumes" Feb 17 19:27:05 crc kubenswrapper[4892]: I0217 19:27:05.372100 4892 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="776ec54b-43e8-4316-bbca-e54f0302c366" path="/var/lib/kubelet/pods/776ec54b-43e8-4316-bbca-e54f0302c366/volumes"
Feb 17 19:27:05 crc kubenswrapper[4892]: I0217 19:27:05.832175 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fdbc494b5-smb4m" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.140:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8080: connect: connection refused"
Feb 17 19:27:07 crc kubenswrapper[4892]: I0217 19:27:07.425370 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 19:27:07 crc kubenswrapper[4892]: I0217 19:27:07.425624 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 19:27:13 crc kubenswrapper[4892]: I0217 19:27:13.058902 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-c7scg"]
Feb 17 19:27:13 crc kubenswrapper[4892]: I0217 19:27:13.071345 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-c7scg"]
Feb 17 19:27:13 crc kubenswrapper[4892]: I0217 19:27:13.375420 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3816c1-c175-4094-89c4-e2143aa24c5c" path="/var/lib/kubelet/pods/ff3816c1-c175-4094-89c4-e2143aa24c5c/volumes"
Feb 17 19:27:15 crc kubenswrapper[4892]: I0217 19:27:15.832011 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fdbc494b5-smb4m" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.140:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8080: connect: connection refused"
Feb 17 19:27:25 crc kubenswrapper[4892]: I0217 19:27:25.832540 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fdbc494b5-smb4m" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.140:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8080: connect: connection refused"
Feb 17 19:27:25 crc kubenswrapper[4892]: I0217 19:27:25.833138 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fdbc494b5-smb4m"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.589748 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fdbc494b5-smb4m"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.757431 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcmw4\" (UniqueName: \"kubernetes.io/projected/4976d40b-aca4-4ac9-956c-ff2719378071-kube-api-access-lcmw4\") pod \"4976d40b-aca4-4ac9-956c-ff2719378071\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") "
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.757529 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4976d40b-aca4-4ac9-956c-ff2719378071-logs\") pod \"4976d40b-aca4-4ac9-956c-ff2719378071\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") "
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.757698 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-scripts\") pod \"4976d40b-aca4-4ac9-956c-ff2719378071\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") "
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.757724 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4976d40b-aca4-4ac9-956c-ff2719378071-horizon-secret-key\") pod \"4976d40b-aca4-4ac9-956c-ff2719378071\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") "
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.757771 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-config-data\") pod \"4976d40b-aca4-4ac9-956c-ff2719378071\" (UID: \"4976d40b-aca4-4ac9-956c-ff2719378071\") "
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.758627 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4976d40b-aca4-4ac9-956c-ff2719378071-logs" (OuterVolumeSpecName: "logs") pod "4976d40b-aca4-4ac9-956c-ff2719378071" (UID: "4976d40b-aca4-4ac9-956c-ff2719378071"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.759795 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4976d40b-aca4-4ac9-956c-ff2719378071-logs\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.772102 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976d40b-aca4-4ac9-956c-ff2719378071-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4976d40b-aca4-4ac9-956c-ff2719378071" (UID: "4976d40b-aca4-4ac9-956c-ff2719378071"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.772151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4976d40b-aca4-4ac9-956c-ff2719378071-kube-api-access-lcmw4" (OuterVolumeSpecName: "kube-api-access-lcmw4") pod "4976d40b-aca4-4ac9-956c-ff2719378071" (UID: "4976d40b-aca4-4ac9-956c-ff2719378071"). InnerVolumeSpecName "kube-api-access-lcmw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.774261 4892 generic.go:334] "Generic (PLEG): container finished" podID="4976d40b-aca4-4ac9-956c-ff2719378071" containerID="e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5" exitCode=137
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.774297 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fdbc494b5-smb4m" event={"ID":"4976d40b-aca4-4ac9-956c-ff2719378071","Type":"ContainerDied","Data":"e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5"}
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.774322 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fdbc494b5-smb4m" event={"ID":"4976d40b-aca4-4ac9-956c-ff2719378071","Type":"ContainerDied","Data":"3c0cd342a848fee3d40486414796d6ecce47cedfa6affe5974990e3a3f8da3e3"}
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.774338 4892 scope.go:117] "RemoveContainer" containerID="461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.774347 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fdbc494b5-smb4m"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.784733 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-scripts" (OuterVolumeSpecName: "scripts") pod "4976d40b-aca4-4ac9-956c-ff2719378071" (UID: "4976d40b-aca4-4ac9-956c-ff2719378071"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.784881 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-config-data" (OuterVolumeSpecName: "config-data") pod "4976d40b-aca4-4ac9-956c-ff2719378071" (UID: "4976d40b-aca4-4ac9-956c-ff2719378071"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.861834 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcmw4\" (UniqueName: \"kubernetes.io/projected/4976d40b-aca4-4ac9-956c-ff2719378071-kube-api-access-lcmw4\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.861865 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.861874 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4976d40b-aca4-4ac9-956c-ff2719378071-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.861883 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4976d40b-aca4-4ac9-956c-ff2719378071-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.933031 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"]
Feb 17 19:27:26 crc kubenswrapper[4892]: E0217 19:27:26.933499 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon-log"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.933513 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon-log"
Feb 17 19:27:26 crc kubenswrapper[4892]: E0217 19:27:26.933529 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.933535 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.933801 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.933868 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" containerName="horizon-log"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.935389 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.937533 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.947196 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"]
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.964056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.964106 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:26 crc kubenswrapper[4892]: I0217 19:27:26.964211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnbq5\" (UniqueName: \"kubernetes.io/projected/a0b5e8ae-9589-439c-b68d-04963a2fc27b-kube-api-access-nnbq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.058963 4892 scope.go:117] "RemoveContainer" containerID="e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.067013 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.067062 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.067223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnbq5\" (UniqueName: \"kubernetes.io/projected/a0b5e8ae-9589-439c-b68d-04963a2fc27b-kube-api-access-nnbq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.067389 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.067495 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.088326 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnbq5\" (UniqueName: \"kubernetes.io/projected/a0b5e8ae-9589-439c-b68d-04963a2fc27b-kube-api-access-nnbq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.094649 4892 scope.go:117] "RemoveContainer" containerID="461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c"
Feb 17 19:27:27 crc kubenswrapper[4892]: E0217 19:27:27.097264 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c\": container with ID starting with 461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c not found: ID does not exist" containerID="461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.097316 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c"} err="failed to get container status \"461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c\": rpc error: code = NotFound desc = could not find container \"461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c\": container with ID starting with 461c1b8b342c0053cfff056cc0b5fd094028a0136d959a19602edab8f6328f3c not found: ID does not exist"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.097347 4892 scope.go:117] "RemoveContainer" containerID="e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5"
Feb 17 19:27:27 crc kubenswrapper[4892]: E0217 19:27:27.097767 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5\": container with ID starting with e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5 not found: ID does not exist" containerID="e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.097799 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5"} err="failed to get container status \"e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5\": rpc error: code = NotFound desc = could not find container \"e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5\": container with ID starting with e2d672890367052c9abe2d31a774ac20d688d0b3209257c1948a416b26bd80b5 not found: ID does not exist"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.119858 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fdbc494b5-smb4m"]
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.134732 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fdbc494b5-smb4m"]
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.275749 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.386078 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4976d40b-aca4-4ac9-956c-ff2719378071" path="/var/lib/kubelet/pods/4976d40b-aca4-4ac9-956c-ff2719378071/volumes"
Feb 17 19:27:27 crc kubenswrapper[4892]: I0217 19:27:27.777112 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"]
Feb 17 19:27:27 crc kubenswrapper[4892]: W0217 19:27:27.794553 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b5e8ae_9589_439c_b68d_04963a2fc27b.slice/crio-d10aa16d86f530b01e1ace00355f259806fb3dafeab563914e6ddbd56e8598e8 WatchSource:0}: Error finding container d10aa16d86f530b01e1ace00355f259806fb3dafeab563914e6ddbd56e8598e8: Status 404 returned error can't find the container with id d10aa16d86f530b01e1ace00355f259806fb3dafeab563914e6ddbd56e8598e8
Feb 17 19:27:28 crc kubenswrapper[4892]: I0217 19:27:28.817749 4892 generic.go:334] "Generic (PLEG): container finished" podID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerID="cb4f15f4dd10d3a07f4af0b93438b24dae7a6d25436ba6310a75f8cb447fe6bf" exitCode=0
Feb 17 19:27:28 crc kubenswrapper[4892]: I0217 19:27:28.818116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg" event={"ID":"a0b5e8ae-9589-439c-b68d-04963a2fc27b","Type":"ContainerDied","Data":"cb4f15f4dd10d3a07f4af0b93438b24dae7a6d25436ba6310a75f8cb447fe6bf"}
Feb 17 19:27:28 crc kubenswrapper[4892]: I0217 19:27:28.818174 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg" event={"ID":"a0b5e8ae-9589-439c-b68d-04963a2fc27b","Type":"ContainerStarted","Data":"d10aa16d86f530b01e1ace00355f259806fb3dafeab563914e6ddbd56e8598e8"}
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.291130 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jhfhx"]
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.295118 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.307686 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jhfhx"]
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.326367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-catalog-content\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.326578 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-utilities\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.326807 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgld9\" (UniqueName: \"kubernetes.io/projected/96ed9f69-fd36-40cd-821f-01de224abc3c-kube-api-access-xgld9\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.428624 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgld9\" (UniqueName: \"kubernetes.io/projected/96ed9f69-fd36-40cd-821f-01de224abc3c-kube-api-access-xgld9\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.428753 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-catalog-content\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.428809 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-utilities\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.429468 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-catalog-content\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.429527 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-utilities\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.454035 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgld9\" (UniqueName: \"kubernetes.io/projected/96ed9f69-fd36-40cd-821f-01de224abc3c-kube-api-access-xgld9\") pod \"redhat-operators-jhfhx\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:29 crc kubenswrapper[4892]: I0217 19:27:29.617999 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jhfhx"
Feb 17 19:27:30 crc kubenswrapper[4892]: I0217 19:27:30.203856 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jhfhx"]
Feb 17 19:27:30 crc kubenswrapper[4892]: W0217 19:27:30.205674 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96ed9f69_fd36_40cd_821f_01de224abc3c.slice/crio-0184ccc28d162d9a966ab02b3e10f60c96b9c7d0c6ee63c508892dba196b8213 WatchSource:0}: Error finding container 0184ccc28d162d9a966ab02b3e10f60c96b9c7d0c6ee63c508892dba196b8213: Status 404 returned error can't find the container with id 0184ccc28d162d9a966ab02b3e10f60c96b9c7d0c6ee63c508892dba196b8213
Feb 17 19:27:30 crc kubenswrapper[4892]: I0217 19:27:30.852760 4892 generic.go:334] "Generic (PLEG): container finished" podID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerID="3cc0b7a23bef17386612fd3f2c3b988409a34d98153d22c1ddb1f4729cb86521" exitCode=0
Feb 17 19:27:30 crc kubenswrapper[4892]: I0217 19:27:30.854682 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg" event={"ID":"a0b5e8ae-9589-439c-b68d-04963a2fc27b","Type":"ContainerDied","Data":"3cc0b7a23bef17386612fd3f2c3b988409a34d98153d22c1ddb1f4729cb86521"}
Feb 17 19:27:30 crc kubenswrapper[4892]: I0217 19:27:30.867310 4892 generic.go:334] "Generic (PLEG): container finished" podID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerID="89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990" exitCode=0
Feb 17 19:27:30 crc kubenswrapper[4892]: I0217 19:27:30.867347 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerDied","Data":"89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990"}
Feb 17 19:27:30 crc kubenswrapper[4892]: I0217 19:27:30.867379 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerStarted","Data":"0184ccc28d162d9a966ab02b3e10f60c96b9c7d0c6ee63c508892dba196b8213"}
Feb 17 19:27:31 crc kubenswrapper[4892]: I0217 19:27:31.883941 4892 generic.go:334] "Generic (PLEG): container finished" podID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerID="d2685c74491f251810daa8b17df69b5073fb2847f093cbd54f80bd668c4c439a" exitCode=0
Feb 17 19:27:31 crc kubenswrapper[4892]: I0217 19:27:31.884177 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg" event={"ID":"a0b5e8ae-9589-439c-b68d-04963a2fc27b","Type":"ContainerDied","Data":"d2685c74491f251810daa8b17df69b5073fb2847f093cbd54f80bd668c4c439a"}
Feb 17 19:27:31 crc kubenswrapper[4892]: I0217 19:27:31.887833 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerStarted","Data":"0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2"}
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.370799 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.692275 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnbq5\" (UniqueName: \"kubernetes.io/projected/a0b5e8ae-9589-439c-b68d-04963a2fc27b-kube-api-access-nnbq5\") pod \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") "
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.692342 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-bundle\") pod \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") "
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.692387 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-util\") pod \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\" (UID: \"a0b5e8ae-9589-439c-b68d-04963a2fc27b\") "
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.693584 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-bundle" (OuterVolumeSpecName: "bundle") pod "a0b5e8ae-9589-439c-b68d-04963a2fc27b" (UID: "a0b5e8ae-9589-439c-b68d-04963a2fc27b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.695735 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-util" (OuterVolumeSpecName: "util") pod "a0b5e8ae-9589-439c-b68d-04963a2fc27b" (UID: "a0b5e8ae-9589-439c-b68d-04963a2fc27b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.698203 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b5e8ae-9589-439c-b68d-04963a2fc27b-kube-api-access-nnbq5" (OuterVolumeSpecName: "kube-api-access-nnbq5") pod "a0b5e8ae-9589-439c-b68d-04963a2fc27b" (UID: "a0b5e8ae-9589-439c-b68d-04963a2fc27b"). InnerVolumeSpecName "kube-api-access-nnbq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.795532 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnbq5\" (UniqueName: \"kubernetes.io/projected/a0b5e8ae-9589-439c-b68d-04963a2fc27b-kube-api-access-nnbq5\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.795578 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.795598 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b5e8ae-9589-439c-b68d-04963a2fc27b-util\") on node \"crc\" DevicePath \"\""
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.941038 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg" event={"ID":"a0b5e8ae-9589-439c-b68d-04963a2fc27b","Type":"ContainerDied","Data":"d10aa16d86f530b01e1ace00355f259806fb3dafeab563914e6ddbd56e8598e8"}
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.941073 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d10aa16d86f530b01e1ace00355f259806fb3dafeab563914e6ddbd56e8598e8"
Feb 17 19:27:33 crc kubenswrapper[4892]: I0217 19:27:33.941132 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.284884 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bpgrs"]
Feb 17 19:27:35 crc kubenswrapper[4892]: E0217 19:27:35.285998 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="util"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.286021 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="util"
Feb 17 19:27:35 crc kubenswrapper[4892]: E0217 19:27:35.286067 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="pull"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.286080 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="pull"
Feb 17 19:27:35 crc kubenswrapper[4892]: E0217 19:27:35.286097 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="extract"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.286108 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="extract"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.286610 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b5e8ae-9589-439c-b68d-04963a2fc27b" containerName="extract"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.292007 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.297779 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bpgrs"]
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.335165 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-utilities\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.335363 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptrp5\" (UniqueName: \"kubernetes.io/projected/9fb11873-b198-4f7e-bbe1-5ab39f163319-kube-api-access-ptrp5\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.335945 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-catalog-content\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.437908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-catalog-content\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.438017 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-utilities\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.438145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptrp5\" (UniqueName: \"kubernetes.io/projected/9fb11873-b198-4f7e-bbe1-5ab39f163319-kube-api-access-ptrp5\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.438310 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-catalog-content\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.438673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-utilities\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.458187 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptrp5\" (UniqueName: \"kubernetes.io/projected/9fb11873-b198-4f7e-bbe1-5ab39f163319-kube-api-access-ptrp5\") pod \"certified-operators-bpgrs\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:35 crc kubenswrapper[4892]: I0217 19:27:35.620107 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bpgrs"
Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.376080 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bpgrs"]
Feb 17 19:27:36 crc kubenswrapper[4892]: W0217 19:27:36.384666 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb11873_b198_4f7e_bbe1_5ab39f163319.slice/crio-ea1c0b78dd5e95bf0fea6de36a3e83d3a199d559fdb49e05505a6c9ac076b5cd WatchSource:0}: Error finding container ea1c0b78dd5e95bf0fea6de36a3e83d3a199d559fdb49e05505a6c9ac076b5cd: Status 404 returned error can't find the container with id ea1c0b78dd5e95bf0fea6de36a3e83d3a199d559fdb49e05505a6c9ac076b5cd
Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.468261 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hpdxn"]
Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.505917 4892 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.530537 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpdxn"] Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.663328 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfxtg\" (UniqueName: \"kubernetes.io/projected/f4408360-2931-4cec-9602-74ffccbd54aa-kube-api-access-cfxtg\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.663721 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-utilities\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.664015 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-catalog-content\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.765968 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfxtg\" (UniqueName: \"kubernetes.io/projected/f4408360-2931-4cec-9602-74ffccbd54aa-kube-api-access-cfxtg\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.766152 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-utilities\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.766242 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-catalog-content\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.766618 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-utilities\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.766800 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-catalog-content\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.785335 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfxtg\" (UniqueName: \"kubernetes.io/projected/f4408360-2931-4cec-9602-74ffccbd54aa-kube-api-access-cfxtg\") pod \"community-operators-hpdxn\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.951826 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.980310 4892 generic.go:334] "Generic (PLEG): container finished" podID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerID="0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2" exitCode=0 Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.980537 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerDied","Data":"0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2"} Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.986612 4892 generic.go:334] "Generic (PLEG): container finished" podID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerID="680a84cd44d046b88798fbba5e216f46111ec8857e75769bff339c12f486215d" exitCode=0 Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.986688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerDied","Data":"680a84cd44d046b88798fbba5e216f46111ec8857e75769bff339c12f486215d"} Feb 17 19:27:36 crc kubenswrapper[4892]: I0217 19:27:36.986961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerStarted","Data":"ea1c0b78dd5e95bf0fea6de36a3e83d3a199d559fdb49e05505a6c9ac076b5cd"} Feb 17 19:27:37 crc kubenswrapper[4892]: I0217 19:27:37.425302 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:27:37 crc kubenswrapper[4892]: I0217 19:27:37.426628 4892 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:27:37 crc kubenswrapper[4892]: I0217 19:27:37.463949 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpdxn"] Feb 17 19:27:37 crc kubenswrapper[4892]: W0217 19:27:37.472189 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4408360_2931_4cec_9602_74ffccbd54aa.slice/crio-499cd627bdb4f205e5c760ccc82b6163087e2e87027036001c00210d2d6cf5d1 WatchSource:0}: Error finding container 499cd627bdb4f205e5c760ccc82b6163087e2e87027036001c00210d2d6cf5d1: Status 404 returned error can't find the container with id 499cd627bdb4f205e5c760ccc82b6163087e2e87027036001c00210d2d6cf5d1 Feb 17 19:27:38 crc kubenswrapper[4892]: I0217 19:27:38.003882 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4408360-2931-4cec-9602-74ffccbd54aa" containerID="0b18afc0b9ae4b8cfddfd81011a8a23e36b4660ae5b2001161f9720b05396065" exitCode=0 Feb 17 19:27:38 crc kubenswrapper[4892]: I0217 19:27:38.004228 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerDied","Data":"0b18afc0b9ae4b8cfddfd81011a8a23e36b4660ae5b2001161f9720b05396065"} Feb 17 19:27:38 crc kubenswrapper[4892]: I0217 19:27:38.004253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerStarted","Data":"499cd627bdb4f205e5c760ccc82b6163087e2e87027036001c00210d2d6cf5d1"} Feb 17 19:27:38 crc kubenswrapper[4892]: I0217 19:27:38.007855 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerStarted","Data":"6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf"} Feb 17 19:27:38 crc kubenswrapper[4892]: I0217 19:27:38.047756 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jhfhx" podStartSLOduration=2.21008069 podStartE2EDuration="9.047742026s" podCreationTimestamp="2026-02-17 19:27:29 +0000 UTC" firstStartedPulling="2026-02-17 19:27:30.86888909 +0000 UTC m=+6222.244292355" lastFinishedPulling="2026-02-17 19:27:37.706550426 +0000 UTC m=+6229.081953691" observedRunningTime="2026-02-17 19:27:38.045856236 +0000 UTC m=+6229.421259501" watchObservedRunningTime="2026-02-17 19:27:38.047742026 +0000 UTC m=+6229.423145291" Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.019448 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerStarted","Data":"b240e393e06c0e1e61ed6d4c21224691607201ba0ba3375fc29560367b00d45a"} Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.021498 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerStarted","Data":"4f5f530a2800d990b19803f00da0f6f6fbc19b8ac24dee6377c26a95ca043fe7"} Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.618245 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jhfhx" Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.618871 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jhfhx" Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.886505 4892 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-24tmv"] Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.890185 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.898126 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-24tmv"] Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.960176 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-utilities\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.960313 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpt5x\" (UniqueName: \"kubernetes.io/projected/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-kube-api-access-mpt5x\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:39 crc kubenswrapper[4892]: I0217 19:27:39.960388 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-catalog-content\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.061018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpt5x\" (UniqueName: \"kubernetes.io/projected/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-kube-api-access-mpt5x\") pod \"redhat-marketplace-24tmv\" (UID: 
\"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.061072 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-catalog-content\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.061183 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-utilities\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.104066 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-utilities\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.104130 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-catalog-content\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.110257 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpt5x\" (UniqueName: \"kubernetes.io/projected/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-kube-api-access-mpt5x\") pod \"redhat-marketplace-24tmv\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " 
pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.222136 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.710332 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jhfhx" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:40 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:40 crc kubenswrapper[4892]: > Feb 17 19:27:40 crc kubenswrapper[4892]: I0217 19:27:40.823784 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-24tmv"] Feb 17 19:27:40 crc kubenswrapper[4892]: W0217 19:27:40.828066 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0459d0da_a36b_47b3_8488_7bfcaae7c8e1.slice/crio-76aab720c02e9613c50ee27578ea59298a2e7e2d5d62993416303c6fc256bc98 WatchSource:0}: Error finding container 76aab720c02e9613c50ee27578ea59298a2e7e2d5d62993416303c6fc256bc98: Status 404 returned error can't find the container with id 76aab720c02e9613c50ee27578ea59298a2e7e2d5d62993416303c6fc256bc98 Feb 17 19:27:41 crc kubenswrapper[4892]: I0217 19:27:41.062938 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerStarted","Data":"76aab720c02e9613c50ee27578ea59298a2e7e2d5d62993416303c6fc256bc98"} Feb 17 19:27:41 crc kubenswrapper[4892]: I0217 19:27:41.068121 4892 generic.go:334] "Generic (PLEG): container finished" podID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerID="b240e393e06c0e1e61ed6d4c21224691607201ba0ba3375fc29560367b00d45a" exitCode=0 Feb 17 19:27:41 crc kubenswrapper[4892]: I0217 
19:27:41.068158 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerDied","Data":"b240e393e06c0e1e61ed6d4c21224691607201ba0ba3375fc29560367b00d45a"} Feb 17 19:27:42 crc kubenswrapper[4892]: I0217 19:27:42.084144 4892 generic.go:334] "Generic (PLEG): container finished" podID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerID="b0834fd9fdcd718d4995c2bc1fb144c48ca911ddb38cc41248d0fb7a4964c160" exitCode=0 Feb 17 19:27:42 crc kubenswrapper[4892]: I0217 19:27:42.084428 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerDied","Data":"b0834fd9fdcd718d4995c2bc1fb144c48ca911ddb38cc41248d0fb7a4964c160"} Feb 17 19:27:42 crc kubenswrapper[4892]: I0217 19:27:42.093363 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4408360-2931-4cec-9602-74ffccbd54aa" containerID="4f5f530a2800d990b19803f00da0f6f6fbc19b8ac24dee6377c26a95ca043fe7" exitCode=0 Feb 17 19:27:42 crc kubenswrapper[4892]: I0217 19:27:42.093443 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerDied","Data":"4f5f530a2800d990b19803f00da0f6f6fbc19b8ac24dee6377c26a95ca043fe7"} Feb 17 19:27:42 crc kubenswrapper[4892]: I0217 19:27:42.098309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerStarted","Data":"e3dccf5b65fc76775a541a1d8182c374429dd6341f2699410ce08bbe2624966a"} Feb 17 19:27:42 crc kubenswrapper[4892]: I0217 19:27:42.162828 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bpgrs" podStartSLOduration=2.71524725 
podStartE2EDuration="7.162796401s" podCreationTimestamp="2026-02-17 19:27:35 +0000 UTC" firstStartedPulling="2026-02-17 19:27:36.988172438 +0000 UTC m=+6228.363575703" lastFinishedPulling="2026-02-17 19:27:41.435721599 +0000 UTC m=+6232.811124854" observedRunningTime="2026-02-17 19:27:42.157381704 +0000 UTC m=+6233.532784999" watchObservedRunningTime="2026-02-17 19:27:42.162796401 +0000 UTC m=+6233.538199666" Feb 17 19:27:43 crc kubenswrapper[4892]: I0217 19:27:43.145125 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerStarted","Data":"7fbb0a87aec2ae19edd6ac7297b091acdba759d565fc94a00655eccb0967423f"} Feb 17 19:27:43 crc kubenswrapper[4892]: I0217 19:27:43.186740 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hpdxn" podStartSLOduration=2.684134622 podStartE2EDuration="7.186719874s" podCreationTimestamp="2026-02-17 19:27:36 +0000 UTC" firstStartedPulling="2026-02-17 19:27:38.00838706 +0000 UTC m=+6229.383790325" lastFinishedPulling="2026-02-17 19:27:42.510972312 +0000 UTC m=+6233.886375577" observedRunningTime="2026-02-17 19:27:43.184233237 +0000 UTC m=+6234.559636492" watchObservedRunningTime="2026-02-17 19:27:43.186719874 +0000 UTC m=+6234.562123139" Feb 17 19:27:44 crc kubenswrapper[4892]: I0217 19:27:44.101663 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d452-account-create-update-ncvgx"] Feb 17 19:27:44 crc kubenswrapper[4892]: I0217 19:27:44.117212 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-64tqq"] Feb 17 19:27:44 crc kubenswrapper[4892]: I0217 19:27:44.134465 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d452-account-create-update-ncvgx"] Feb 17 19:27:44 crc kubenswrapper[4892]: I0217 19:27:44.158087 4892 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-db-create-64tqq"] Feb 17 19:27:44 crc kubenswrapper[4892]: I0217 19:27:44.196375 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerStarted","Data":"ab92829da0a5598f996df6b2470d6e69ab2b48f8afb70a05cccc562858d40eb6"} Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.214341 4892 generic.go:334] "Generic (PLEG): container finished" podID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerID="ab92829da0a5598f996df6b2470d6e69ab2b48f8afb70a05cccc562858d40eb6" exitCode=0 Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.214439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerDied","Data":"ab92829da0a5598f996df6b2470d6e69ab2b48f8afb70a05cccc562858d40eb6"} Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.371870 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f9bb32-23ac-4897-a488-25db578b4696" path="/var/lib/kubelet/pods/26f9bb32-23ac-4897-a488-25db578b4696/volumes" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.372450 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ffea8d-6680-40df-8b32-312e03efb9aa" path="/var/lib/kubelet/pods/c2ffea8d-6680-40df-8b32-312e03efb9aa/volumes" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.603237 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.605013 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.611081 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rt9th" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.611399 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.611516 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.621415 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bpgrs" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.621972 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bpgrs" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.654662 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.723002 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfxqr\" (UniqueName: \"kubernetes.io/projected/b593a6d0-56ba-4022-8084-246a3ac9fd30-kube-api-access-kfxqr\") pod \"obo-prometheus-operator-68bc856cb9-2tk5v\" (UID: \"b593a6d0-56ba-4022-8084-246a3ac9fd30\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.724411 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.727171 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.729199 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.729381 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ktv8v" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.760381 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.770893 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.789628 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.824333 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.828148 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a9c57e6-a854-4895-b520-267ac9379772-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm\" (UID: \"8a9c57e6-a854-4895-b520-267ac9379772\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.828194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfxqr\" (UniqueName: \"kubernetes.io/projected/b593a6d0-56ba-4022-8084-246a3ac9fd30-kube-api-access-kfxqr\") pod \"obo-prometheus-operator-68bc856cb9-2tk5v\" (UID: \"b593a6d0-56ba-4022-8084-246a3ac9fd30\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.828359 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a9c57e6-a854-4895-b520-267ac9379772-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm\" (UID: \"8a9c57e6-a854-4895-b520-267ac9379772\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.880488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfxqr\" (UniqueName: \"kubernetes.io/projected/b593a6d0-56ba-4022-8084-246a3ac9fd30-kube-api-access-kfxqr\") pod 
\"obo-prometheus-operator-68bc856cb9-2tk5v\" (UID: \"b593a6d0-56ba-4022-8084-246a3ac9fd30\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.927216 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.934387 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a9c57e6-a854-4895-b520-267ac9379772-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm\" (UID: \"8a9c57e6-a854-4895-b520-267ac9379772\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.934472 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25bc72d9-8845-4954-93f7-657b3cac94b6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f\" (UID: \"25bc72d9-8845-4954-93f7-657b3cac94b6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.934558 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a9c57e6-a854-4895-b520-267ac9379772-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm\" (UID: \"8a9c57e6-a854-4895-b520-267ac9379772\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.934578 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/25bc72d9-8845-4954-93f7-657b3cac94b6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f\" (UID: \"25bc72d9-8845-4954-93f7-657b3cac94b6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.938917 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hdpxb"] Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.940282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a9c57e6-a854-4895-b520-267ac9379772-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm\" (UID: \"8a9c57e6-a854-4895-b520-267ac9379772\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.940732 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.943077 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a9c57e6-a854-4895-b520-267ac9379772-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm\" (UID: \"8a9c57e6-a854-4895-b520-267ac9379772\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.955210 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-st28q" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.955396 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 17 19:27:45 crc kubenswrapper[4892]: I0217 19:27:45.967362 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hdpxb"] Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.036352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hcsc\" (UniqueName: \"kubernetes.io/projected/6d3c1e97-ee06-41a8-890b-8606b2297aa0-kube-api-access-5hcsc\") pod \"observability-operator-59bdc8b94-hdpxb\" (UID: \"6d3c1e97-ee06-41a8-890b-8606b2297aa0\") " pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.036408 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3c1e97-ee06-41a8-890b-8606b2297aa0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hdpxb\" (UID: \"6d3c1e97-ee06-41a8-890b-8606b2297aa0\") " 
pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.036501 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25bc72d9-8845-4954-93f7-657b3cac94b6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f\" (UID: \"25bc72d9-8845-4954-93f7-657b3cac94b6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.036541 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25bc72d9-8845-4954-93f7-657b3cac94b6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f\" (UID: \"25bc72d9-8845-4954-93f7-657b3cac94b6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.057016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25bc72d9-8845-4954-93f7-657b3cac94b6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f\" (UID: \"25bc72d9-8845-4954-93f7-657b3cac94b6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.057382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25bc72d9-8845-4954-93f7-657b3cac94b6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f\" (UID: \"25bc72d9-8845-4954-93f7-657b3cac94b6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.066612 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.085403 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-n5wfv"] Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.086804 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.089095 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vbqkg" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.121573 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.138413 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjmmg\" (UniqueName: \"kubernetes.io/projected/609d9353-9db2-4a1c-8f00-8cfe986c3b12-kube-api-access-kjmmg\") pod \"perses-operator-5bf474d74f-n5wfv\" (UID: \"609d9353-9db2-4a1c-8f00-8cfe986c3b12\") " pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.138684 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hcsc\" (UniqueName: \"kubernetes.io/projected/6d3c1e97-ee06-41a8-890b-8606b2297aa0-kube-api-access-5hcsc\") pod \"observability-operator-59bdc8b94-hdpxb\" (UID: \"6d3c1e97-ee06-41a8-890b-8606b2297aa0\") " pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.138812 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6d3c1e97-ee06-41a8-890b-8606b2297aa0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hdpxb\" (UID: \"6d3c1e97-ee06-41a8-890b-8606b2297aa0\") " pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.139073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/609d9353-9db2-4a1c-8f00-8cfe986c3b12-openshift-service-ca\") pod \"perses-operator-5bf474d74f-n5wfv\" (UID: \"609d9353-9db2-4a1c-8f00-8cfe986c3b12\") " pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.148186 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-n5wfv"] Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.164127 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3c1e97-ee06-41a8-890b-8606b2297aa0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hdpxb\" (UID: \"6d3c1e97-ee06-41a8-890b-8606b2297aa0\") " pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.177567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hcsc\" (UniqueName: \"kubernetes.io/projected/6d3c1e97-ee06-41a8-890b-8606b2297aa0-kube-api-access-5hcsc\") pod \"observability-operator-59bdc8b94-hdpxb\" (UID: \"6d3c1e97-ee06-41a8-890b-8606b2297aa0\") " pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.218056 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.255145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/609d9353-9db2-4a1c-8f00-8cfe986c3b12-openshift-service-ca\") pod \"perses-operator-5bf474d74f-n5wfv\" (UID: \"609d9353-9db2-4a1c-8f00-8cfe986c3b12\") " pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.255491 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmmg\" (UniqueName: \"kubernetes.io/projected/609d9353-9db2-4a1c-8f00-8cfe986c3b12-kube-api-access-kjmmg\") pod \"perses-operator-5bf474d74f-n5wfv\" (UID: \"609d9353-9db2-4a1c-8f00-8cfe986c3b12\") " pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.256760 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/609d9353-9db2-4a1c-8f00-8cfe986c3b12-openshift-service-ca\") pod \"perses-operator-5bf474d74f-n5wfv\" (UID: \"609d9353-9db2-4a1c-8f00-8cfe986c3b12\") " pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.304373 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmmg\" (UniqueName: \"kubernetes.io/projected/609d9353-9db2-4a1c-8f00-8cfe986c3b12-kube-api-access-kjmmg\") pod \"perses-operator-5bf474d74f-n5wfv\" (UID: \"609d9353-9db2-4a1c-8f00-8cfe986c3b12\") " pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.589385 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.716193 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bpgrs" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:46 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:46 crc kubenswrapper[4892]: > Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.906559 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm"] Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.952014 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:46 crc kubenswrapper[4892]: I0217 19:27:46.952058 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.020149 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v"] Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.126194 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f"] Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.200870 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hdpxb"] Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.283956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" 
event={"ID":"8a9c57e6-a854-4895-b520-267ac9379772","Type":"ContainerStarted","Data":"8dd4832cb9243c39a220d96bd79a3661957ad3ebbc33c2299282fdb68377411b"} Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.287826 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" event={"ID":"6d3c1e97-ee06-41a8-890b-8606b2297aa0","Type":"ContainerStarted","Data":"6ec5360c360c3b1fefa511bd936fe9b3b8dd3f0e5008104dae7aa85b044bb478"} Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.300563 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerStarted","Data":"7fb7f81ec19a349a2830c90c23b413d90fc318b0ad36f23e546480639479facc"} Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.304106 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" event={"ID":"b593a6d0-56ba-4022-8084-246a3ac9fd30","Type":"ContainerStarted","Data":"5775fa013992f6dd7ec0a138ae16c0f70aeafe204c79c3c553cb3c1886ecb94f"} Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.305869 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" event={"ID":"25bc72d9-8845-4954-93f7-657b3cac94b6","Type":"ContainerStarted","Data":"f405a1306ab20d91a7c3b647cd350642f028869517fe6c0364d229fda23060ef"} Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.333554 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-n5wfv"] Feb 17 19:27:47 crc kubenswrapper[4892]: I0217 19:27:47.336798 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-24tmv" podStartSLOduration=4.687356612 podStartE2EDuration="8.336784126s" podCreationTimestamp="2026-02-17 19:27:39 +0000 UTC" 
firstStartedPulling="2026-02-17 19:27:42.089583998 +0000 UTC m=+6233.464987263" lastFinishedPulling="2026-02-17 19:27:45.739011512 +0000 UTC m=+6237.114414777" observedRunningTime="2026-02-17 19:27:47.321672658 +0000 UTC m=+6238.697075923" watchObservedRunningTime="2026-02-17 19:27:47.336784126 +0000 UTC m=+6238.712187391" Feb 17 19:27:48 crc kubenswrapper[4892]: I0217 19:27:48.033983 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hpdxn" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:48 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:48 crc kubenswrapper[4892]: > Feb 17 19:27:48 crc kubenswrapper[4892]: I0217 19:27:48.323199 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" event={"ID":"609d9353-9db2-4a1c-8f00-8cfe986c3b12","Type":"ContainerStarted","Data":"95d17dd498c105ac89b4ed9fb3c9436210f9e2177c55594abd950e6dad62c3d2"} Feb 17 19:27:50 crc kubenswrapper[4892]: I0217 19:27:50.222592 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:50 crc kubenswrapper[4892]: I0217 19:27:50.225011 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:27:50 crc kubenswrapper[4892]: I0217 19:27:50.705398 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jhfhx" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:50 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:50 crc kubenswrapper[4892]: > Feb 17 19:27:51 crc kubenswrapper[4892]: I0217 19:27:51.038294 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-j4r8j"] Feb 17 19:27:51 crc kubenswrapper[4892]: I0217 19:27:51.054912 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-j4r8j"] Feb 17 19:27:51 crc kubenswrapper[4892]: I0217 19:27:51.313823 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-24tmv" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:51 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:51 crc kubenswrapper[4892]: > Feb 17 19:27:51 crc kubenswrapper[4892]: I0217 19:27:51.372754 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce8da88-116f-4f18-8d2f-050918735c8f" path="/var/lib/kubelet/pods/bce8da88-116f-4f18-8d2f-050918735c8f/volumes" Feb 17 19:27:56 crc kubenswrapper[4892]: I0217 19:27:56.709458 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bpgrs" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:56 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:56 crc kubenswrapper[4892]: > Feb 17 19:27:58 crc kubenswrapper[4892]: I0217 19:27:58.015148 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hpdxn" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="registry-server" probeResult="failure" output=< Feb 17 19:27:58 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:27:58 crc kubenswrapper[4892]: > Feb 17 19:28:00 crc kubenswrapper[4892]: I0217 19:28:00.305327 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:28:00 crc kubenswrapper[4892]: I0217 19:28:00.367397 4892 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:28:00 crc kubenswrapper[4892]: I0217 19:28:00.731025 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jhfhx" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" probeResult="failure" output=< Feb 17 19:28:00 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:28:00 crc kubenswrapper[4892]: > Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.566064 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" event={"ID":"25bc72d9-8845-4954-93f7-657b3cac94b6","Type":"ContainerStarted","Data":"3b221603e6f5d25839798435fecd04ffd064f849d30066746659f63a1bea8f1c"} Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.571151 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" event={"ID":"8a9c57e6-a854-4895-b520-267ac9379772","Type":"ContainerStarted","Data":"e3e832b77a6a471121f477fb714e5c0182375d2ae23890eaf6b1e25aee779b05"} Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.574026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" event={"ID":"6d3c1e97-ee06-41a8-890b-8606b2297aa0","Type":"ContainerStarted","Data":"a1f6a85c34fd57d1d072aaaa9cb4fb0b1206e6527d2eb746de487184a93dbd52"} Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.574804 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.576887 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" 
event={"ID":"b593a6d0-56ba-4022-8084-246a3ac9fd30","Type":"ContainerStarted","Data":"d1b566279bbf91da9e67fe33058b706349350948969ae5828f096990aa456847"} Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.579680 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.579764 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" event={"ID":"609d9353-9db2-4a1c-8f00-8cfe986c3b12","Type":"ContainerStarted","Data":"b8c980a6a3981feaf9c191f264a6e2ac28672049f3bddf970cda949e5ec91a09"} Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.580263 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.597323 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f" podStartSLOduration=3.736477038 podStartE2EDuration="17.597302433s" podCreationTimestamp="2026-02-17 19:27:45 +0000 UTC" firstStartedPulling="2026-02-17 19:27:47.119446781 +0000 UTC m=+6238.494850036" lastFinishedPulling="2026-02-17 19:28:00.980272165 +0000 UTC m=+6252.355675431" observedRunningTime="2026-02-17 19:28:02.584970288 +0000 UTC m=+6253.960373553" watchObservedRunningTime="2026-02-17 19:28:02.597302433 +0000 UTC m=+6253.972705698" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.611837 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2tk5v" podStartSLOduration=3.664012285 podStartE2EDuration="17.611808735s" podCreationTimestamp="2026-02-17 19:27:45 +0000 UTC" firstStartedPulling="2026-02-17 19:27:47.030000148 +0000 UTC m=+6238.405403413" lastFinishedPulling="2026-02-17 19:28:00.977796598 
+0000 UTC m=+6252.353199863" observedRunningTime="2026-02-17 19:28:02.601114486 +0000 UTC m=+6253.976517761" watchObservedRunningTime="2026-02-17 19:28:02.611808735 +0000 UTC m=+6253.987212000" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.652333 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" podStartSLOduration=2.924451238 podStartE2EDuration="16.652315572s" podCreationTimestamp="2026-02-17 19:27:46 +0000 UTC" firstStartedPulling="2026-02-17 19:27:47.331619007 +0000 UTC m=+6238.707022272" lastFinishedPulling="2026-02-17 19:28:01.059483341 +0000 UTC m=+6252.434886606" observedRunningTime="2026-02-17 19:28:02.6396772 +0000 UTC m=+6254.015080475" watchObservedRunningTime="2026-02-17 19:28:02.652315572 +0000 UTC m=+6254.027718837" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.671388 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-hdpxb" podStartSLOduration=3.873844988 podStartE2EDuration="17.671366958s" podCreationTimestamp="2026-02-17 19:27:45 +0000 UTC" firstStartedPulling="2026-02-17 19:27:47.221073463 +0000 UTC m=+6238.596476728" lastFinishedPulling="2026-02-17 19:28:01.018595433 +0000 UTC m=+6252.393998698" observedRunningTime="2026-02-17 19:28:02.657260626 +0000 UTC m=+6254.032663891" watchObservedRunningTime="2026-02-17 19:28:02.671366958 +0000 UTC m=+6254.046770223" Feb 17 19:28:02 crc kubenswrapper[4892]: I0217 19:28:02.695208 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm" podStartSLOduration=3.64681426 podStartE2EDuration="17.695189404s" podCreationTimestamp="2026-02-17 19:27:45 +0000 UTC" firstStartedPulling="2026-02-17 19:27:46.928658823 +0000 UTC m=+6238.304062088" lastFinishedPulling="2026-02-17 19:28:00.977033967 +0000 UTC m=+6252.352437232" 
observedRunningTime="2026-02-17 19:28:02.684236047 +0000 UTC m=+6254.059639312" watchObservedRunningTime="2026-02-17 19:28:02.695189404 +0000 UTC m=+6254.070592669" Feb 17 19:28:03 crc kubenswrapper[4892]: I0217 19:28:03.277170 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-24tmv"] Feb 17 19:28:03 crc kubenswrapper[4892]: I0217 19:28:03.277681 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-24tmv" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="registry-server" containerID="cri-o://7fb7f81ec19a349a2830c90c23b413d90fc318b0ad36f23e546480639479facc" gracePeriod=2 Feb 17 19:28:03 crc kubenswrapper[4892]: I0217 19:28:03.592478 4892 generic.go:334] "Generic (PLEG): container finished" podID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerID="7fb7f81ec19a349a2830c90c23b413d90fc318b0ad36f23e546480639479facc" exitCode=0 Feb 17 19:28:03 crc kubenswrapper[4892]: I0217 19:28:03.593402 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerDied","Data":"7fb7f81ec19a349a2830c90c23b413d90fc318b0ad36f23e546480639479facc"} Feb 17 19:28:03 crc kubenswrapper[4892]: I0217 19:28:03.914926 4892 scope.go:117] "RemoveContainer" containerID="9b74d5e0808d4916acfebbdce4333b4195dc6c278e31c267ea060b4652a9b98f" Feb 17 19:28:03 crc kubenswrapper[4892]: I0217 19:28:03.951334 4892 scope.go:117] "RemoveContainer" containerID="64ede27a0870296551f3f27b1149629c1800c8d7d496d0949a20bc79ab66c89a" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.051792 4892 scope.go:117] "RemoveContainer" containerID="899255e01f33471ea060742028bbabc149c65b6233a959a55119a355e123632b" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.086839 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.132441 4892 scope.go:117] "RemoveContainer" containerID="e40a8b2216b350aee3d40e0347f9e71faab93f401d48f3fa6717da13c86c484d" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.152677 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-utilities\") pod \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.152799 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpt5x\" (UniqueName: \"kubernetes.io/projected/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-kube-api-access-mpt5x\") pod \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.152863 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-catalog-content\") pod \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\" (UID: \"0459d0da-a36b-47b3-8488-7bfcaae7c8e1\") " Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.154538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-utilities" (OuterVolumeSpecName: "utilities") pod "0459d0da-a36b-47b3-8488-7bfcaae7c8e1" (UID: "0459d0da-a36b-47b3-8488-7bfcaae7c8e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.163238 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-kube-api-access-mpt5x" (OuterVolumeSpecName: "kube-api-access-mpt5x") pod "0459d0da-a36b-47b3-8488-7bfcaae7c8e1" (UID: "0459d0da-a36b-47b3-8488-7bfcaae7c8e1"). InnerVolumeSpecName "kube-api-access-mpt5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.167904 4892 scope.go:117] "RemoveContainer" containerID="2124fc3cd68ef6e01d0bda780f105f3113271b407b7ed59edf969e538b690838" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.181630 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0459d0da-a36b-47b3-8488-7bfcaae7c8e1" (UID: "0459d0da-a36b-47b3-8488-7bfcaae7c8e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.197286 4892 scope.go:117] "RemoveContainer" containerID="ccb9f9aaf311645e58e52d2cdb654cf0177eb185804f2a0b51390b71db11738a" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.255758 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpt5x\" (UniqueName: \"kubernetes.io/projected/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-kube-api-access-mpt5x\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.255787 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.255797 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0459d0da-a36b-47b3-8488-7bfcaae7c8e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.607304 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24tmv" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.607326 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24tmv" event={"ID":"0459d0da-a36b-47b3-8488-7bfcaae7c8e1","Type":"ContainerDied","Data":"76aab720c02e9613c50ee27578ea59298a2e7e2d5d62993416303c6fc256bc98"} Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.607384 4892 scope.go:117] "RemoveContainer" containerID="7fb7f81ec19a349a2830c90c23b413d90fc318b0ad36f23e546480639479facc" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.651747 4892 scope.go:117] "RemoveContainer" containerID="ab92829da0a5598f996df6b2470d6e69ab2b48f8afb70a05cccc562858d40eb6" Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.670064 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-24tmv"] Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.691457 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-24tmv"] Feb 17 19:28:04 crc kubenswrapper[4892]: I0217 19:28:04.706239 4892 scope.go:117] "RemoveContainer" containerID="b0834fd9fdcd718d4995c2bc1fb144c48ca911ddb38cc41248d0fb7a4964c160" Feb 17 19:28:05 crc kubenswrapper[4892]: I0217 19:28:05.380266 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" path="/var/lib/kubelet/pods/0459d0da-a36b-47b3-8488-7bfcaae7c8e1/volumes" Feb 17 19:28:05 crc kubenswrapper[4892]: I0217 19:28:05.679291 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bpgrs" Feb 17 19:28:05 crc kubenswrapper[4892]: I0217 19:28:05.739088 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bpgrs" Feb 17 19:28:06 crc kubenswrapper[4892]: I0217 19:28:06.592526 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.021711 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.099225 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.424695 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.426271 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.426581 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.427051 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c781ca08bf3af58232a3452d8a6e42ca4ea1616baa6e1729ebe4342764473bf"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:28:07 crc kubenswrapper[4892]: I0217 19:28:07.427135 4892 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://2c781ca08bf3af58232a3452d8a6e42ca4ea1616baa6e1729ebe4342764473bf" gracePeriod=600 Feb 17 19:28:08 crc kubenswrapper[4892]: I0217 19:28:08.260923 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpdxn"] Feb 17 19:28:08 crc kubenswrapper[4892]: I0217 19:28:08.668535 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hpdxn" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="registry-server" containerID="cri-o://7fbb0a87aec2ae19edd6ac7297b091acdba759d565fc94a00655eccb0967423f" gracePeriod=2 Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.062619 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bpgrs"] Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.062892 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bpgrs" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="registry-server" containerID="cri-o://e3dccf5b65fc76775a541a1d8182c374429dd6341f2699410ce08bbe2624966a" gracePeriod=2 Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.740838 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4408360-2931-4cec-9602-74ffccbd54aa" containerID="7fbb0a87aec2ae19edd6ac7297b091acdba759d565fc94a00655eccb0967423f" exitCode=0 Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.741284 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerDied","Data":"7fbb0a87aec2ae19edd6ac7297b091acdba759d565fc94a00655eccb0967423f"} Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 
19:28:09.749125 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="2c781ca08bf3af58232a3452d8a6e42ca4ea1616baa6e1729ebe4342764473bf" exitCode=0 Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.749208 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"2c781ca08bf3af58232a3452d8a6e42ca4ea1616baa6e1729ebe4342764473bf"} Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.749241 4892 scope.go:117] "RemoveContainer" containerID="0b9ac01ffb1eb212ecc7dea4def4de73890b147a83b4b71346712fbd55b67ac2" Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.758745 4892 generic.go:334] "Generic (PLEG): container finished" podID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerID="e3dccf5b65fc76775a541a1d8182c374429dd6341f2699410ce08bbe2624966a" exitCode=0 Feb 17 19:28:09 crc kubenswrapper[4892]: I0217 19:28:09.758777 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerDied","Data":"e3dccf5b65fc76775a541a1d8182c374429dd6341f2699410ce08bbe2624966a"} Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.407187 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.418198 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bpgrs" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.602979 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-utilities\") pod \"f4408360-2931-4cec-9602-74ffccbd54aa\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.603048 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfxtg\" (UniqueName: \"kubernetes.io/projected/f4408360-2931-4cec-9602-74ffccbd54aa-kube-api-access-cfxtg\") pod \"f4408360-2931-4cec-9602-74ffccbd54aa\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.603097 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptrp5\" (UniqueName: \"kubernetes.io/projected/9fb11873-b198-4f7e-bbe1-5ab39f163319-kube-api-access-ptrp5\") pod \"9fb11873-b198-4f7e-bbe1-5ab39f163319\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.603141 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-catalog-content\") pod \"f4408360-2931-4cec-9602-74ffccbd54aa\" (UID: \"f4408360-2931-4cec-9602-74ffccbd54aa\") " Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.603159 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-catalog-content\") pod \"9fb11873-b198-4f7e-bbe1-5ab39f163319\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.603230 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-utilities\") pod \"9fb11873-b198-4f7e-bbe1-5ab39f163319\" (UID: \"9fb11873-b198-4f7e-bbe1-5ab39f163319\") " Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.603533 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-utilities" (OuterVolumeSpecName: "utilities") pod "f4408360-2931-4cec-9602-74ffccbd54aa" (UID: "f4408360-2931-4cec-9602-74ffccbd54aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.605194 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-utilities" (OuterVolumeSpecName: "utilities") pod "9fb11873-b198-4f7e-bbe1-5ab39f163319" (UID: "9fb11873-b198-4f7e-bbe1-5ab39f163319"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.615003 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb11873-b198-4f7e-bbe1-5ab39f163319-kube-api-access-ptrp5" (OuterVolumeSpecName: "kube-api-access-ptrp5") pod "9fb11873-b198-4f7e-bbe1-5ab39f163319" (UID: "9fb11873-b198-4f7e-bbe1-5ab39f163319"). InnerVolumeSpecName "kube-api-access-ptrp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.622231 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4408360-2931-4cec-9602-74ffccbd54aa-kube-api-access-cfxtg" (OuterVolumeSpecName: "kube-api-access-cfxtg") pod "f4408360-2931-4cec-9602-74ffccbd54aa" (UID: "f4408360-2931-4cec-9602-74ffccbd54aa"). InnerVolumeSpecName "kube-api-access-cfxtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.673823 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4408360-2931-4cec-9602-74ffccbd54aa" (UID: "f4408360-2931-4cec-9602-74ffccbd54aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.685471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fb11873-b198-4f7e-bbe1-5ab39f163319" (UID: "9fb11873-b198-4f7e-bbe1-5ab39f163319"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.708445 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.708472 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.708481 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb11873-b198-4f7e-bbe1-5ab39f163319-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.708491 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4408360-2931-4cec-9602-74ffccbd54aa-utilities\") on node \"crc\" DevicePath 
\"\"" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.708500 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfxtg\" (UniqueName: \"kubernetes.io/projected/f4408360-2931-4cec-9602-74ffccbd54aa-kube-api-access-cfxtg\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.708509 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptrp5\" (UniqueName: \"kubernetes.io/projected/9fb11873-b198-4f7e-bbe1-5ab39f163319-kube-api-access-ptrp5\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.776102 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jhfhx" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" probeResult="failure" output=< Feb 17 19:28:10 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:28:10 crc kubenswrapper[4892]: > Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.785591 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpdxn" event={"ID":"f4408360-2931-4cec-9602-74ffccbd54aa","Type":"ContainerDied","Data":"499cd627bdb4f205e5c760ccc82b6163087e2e87027036001c00210d2d6cf5d1"} Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.785640 4892 scope.go:117] "RemoveContainer" containerID="7fbb0a87aec2ae19edd6ac7297b091acdba759d565fc94a00655eccb0967423f" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.785781 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpdxn" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.795960 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09"} Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.799948 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpgrs" event={"ID":"9fb11873-b198-4f7e-bbe1-5ab39f163319","Type":"ContainerDied","Data":"ea1c0b78dd5e95bf0fea6de36a3e83d3a199d559fdb49e05505a6c9ac076b5cd"} Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.800046 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bpgrs" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.825736 4892 scope.go:117] "RemoveContainer" containerID="4f5f530a2800d990b19803f00da0f6f6fbc19b8ac24dee6377c26a95ca043fe7" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.900537 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpdxn"] Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.942070 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hpdxn"] Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.961870 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bpgrs"] Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.965826 4892 scope.go:117] "RemoveContainer" containerID="0b18afc0b9ae4b8cfddfd81011a8a23e36b4660ae5b2001161f9720b05396065" Feb 17 19:28:10 crc kubenswrapper[4892]: I0217 19:28:10.973406 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bpgrs"] Feb 17 
19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.009744 4892 scope.go:117] "RemoveContainer" containerID="e3dccf5b65fc76775a541a1d8182c374429dd6341f2699410ce08bbe2624966a" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.064706 4892 scope.go:117] "RemoveContainer" containerID="b240e393e06c0e1e61ed6d4c21224691607201ba0ba3375fc29560367b00d45a" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.102652 4892 scope.go:117] "RemoveContainer" containerID="680a84cd44d046b88798fbba5e216f46111ec8857e75769bff339c12f486215d" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.139865 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.140107 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="68d93bba-ae69-4c30-8c55-7818f38437b7" containerName="openstackclient" containerID="cri-o://8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675" gracePeriod=2 Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.166886 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185212 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185786 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d93bba-ae69-4c30-8c55-7818f38437b7" containerName="openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185800 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d93bba-ae69-4c30-8c55-7818f38437b7" containerName="openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185840 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185851 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185868 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="extract-utilities" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185878 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="extract-utilities" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185903 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185911 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185928 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="extract-utilities" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185934 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="extract-utilities" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185943 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="extract-content" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185951 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="extract-content" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185963 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="extract-utilities" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185968 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="extract-utilities" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.185982 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="extract-content" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.185988 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="extract-content" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.186017 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.186022 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: E0217 19:28:11.186035 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="extract-content" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.186041 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="extract-content" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.186265 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0459d0da-a36b-47b3-8488-7bfcaae7c8e1" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.186273 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d93bba-ae69-4c30-8c55-7818f38437b7" containerName="openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.186294 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4408360-2931-4cec-9602-74ffccbd54aa" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.186306 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" containerName="registry-server" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.187149 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.205142 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="68d93bba-ae69-4c30-8c55-7818f38437b7" podUID="3519261b-9df6-4bbd-976b-a6987e030742" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.206077 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.314681 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.316212 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.318180 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-55v94" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.325408 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.331799 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3519261b-9df6-4bbd-976b-a6987e030742-openstack-config\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.331872 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3519261b-9df6-4bbd-976b-a6987e030742-openstack-config-secret\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.331952 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2j5\" (UniqueName: \"kubernetes.io/projected/3519261b-9df6-4bbd-976b-a6987e030742-kube-api-access-4d2j5\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.383711 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb11873-b198-4f7e-bbe1-5ab39f163319" path="/var/lib/kubelet/pods/9fb11873-b198-4f7e-bbe1-5ab39f163319/volumes" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.384408 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f4408360-2931-4cec-9602-74ffccbd54aa" path="/var/lib/kubelet/pods/f4408360-2931-4cec-9602-74ffccbd54aa/volumes" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.435151 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbxc\" (UniqueName: \"kubernetes.io/projected/7e8e76a7-b0ec-4b83-b288-c80fdd74ff97-kube-api-access-9bbxc\") pod \"kube-state-metrics-0\" (UID: \"7e8e76a7-b0ec-4b83-b288-c80fdd74ff97\") " pod="openstack/kube-state-metrics-0" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.435432 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3519261b-9df6-4bbd-976b-a6987e030742-openstack-config\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.435545 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3519261b-9df6-4bbd-976b-a6987e030742-openstack-config-secret\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.435684 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2j5\" (UniqueName: \"kubernetes.io/projected/3519261b-9df6-4bbd-976b-a6987e030742-kube-api-access-4d2j5\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.436643 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3519261b-9df6-4bbd-976b-a6987e030742-openstack-config\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " 
pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.442068 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3519261b-9df6-4bbd-976b-a6987e030742-openstack-config-secret\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.470059 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2j5\" (UniqueName: \"kubernetes.io/projected/3519261b-9df6-4bbd-976b-a6987e030742-kube-api-access-4d2j5\") pod \"openstackclient\" (UID: \"3519261b-9df6-4bbd-976b-a6987e030742\") " pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.524352 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.537639 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbxc\" (UniqueName: \"kubernetes.io/projected/7e8e76a7-b0ec-4b83-b288-c80fdd74ff97-kube-api-access-9bbxc\") pod \"kube-state-metrics-0\" (UID: \"7e8e76a7-b0ec-4b83-b288-c80fdd74ff97\") " pod="openstack/kube-state-metrics-0" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.569378 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bbxc\" (UniqueName: \"kubernetes.io/projected/7e8e76a7-b0ec-4b83-b288-c80fdd74ff97-kube-api-access-9bbxc\") pod \"kube-state-metrics-0\" (UID: \"7e8e76a7-b0ec-4b83-b288-c80fdd74ff97\") " pod="openstack/kube-state-metrics-0" Feb 17 19:28:11 crc kubenswrapper[4892]: I0217 19:28:11.633262 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.236877 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.247946 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.257135 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.257404 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.257419 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.257411 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-nrwps" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.261081 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.285365 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.308424 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359067 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359111 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4x2m\" (UniqueName: \"kubernetes.io/projected/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-kube-api-access-t4x2m\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359236 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.359664 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.438655 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.462246 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.462294 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.462316 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 
crc kubenswrapper[4892]: I0217 19:28:12.462410 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.462428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4x2m\" (UniqueName: \"kubernetes.io/projected/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-kube-api-access-t4x2m\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.462478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.462522 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.463086 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc 
kubenswrapper[4892]: I0217 19:28:12.476988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.478056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.479289 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.480631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.481380 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.503702 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-t4x2m\" (UniqueName: \"kubernetes.io/projected/b1b1be0e-8ec2-48ea-967b-a89c7e20bea9-kube-api-access-t4x2m\") pod \"alertmanager-metric-storage-0\" (UID: \"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.678603 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.707321 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.711406 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.713555 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.713835 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.713946 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.714670 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.714858 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.715125 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.715131 4892 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xjlzn" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.715414 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.722320 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872420 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872471 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-config\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872524 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8b75602-eb3c-41f2-85d9-e5b055bd0724-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872575 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872597 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872732 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.872773 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.873256 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcjp\" (UniqueName: \"kubernetes.io/projected/b8b75602-eb3c-41f2-85d9-e5b055bd0724-kube-api-access-ztcjp\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 
19:28:12.873349 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8b75602-eb3c-41f2-85d9-e5b055bd0724-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.873382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.902745 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3519261b-9df6-4bbd-976b-a6987e030742","Type":"ContainerStarted","Data":"f0f889313ee9c13ddd5900c31d6fb6de0e8dce57c3bcb72fc8bd365b246e7913"} Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.903083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3519261b-9df6-4bbd-976b-a6987e030742","Type":"ContainerStarted","Data":"d589bf6a6ddfd022d97b2592d8669258752c2b9806c76d8be9fa794239464ffb"} Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.904984 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e8e76a7-b0ec-4b83-b288-c80fdd74ff97","Type":"ContainerStarted","Data":"7a705cc313be6805addb044a7d2d7502d5ed89dafb827e11fbc71353611f17d3"} Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.924177 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.924158241 podStartE2EDuration="1.924158241s" podCreationTimestamp="2026-02-17 19:28:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:28:12.917864041 +0000 UTC m=+6264.293267306" watchObservedRunningTime="2026-02-17 19:28:12.924158241 +0000 UTC m=+6264.299561506" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985271 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8b75602-eb3c-41f2-85d9-e5b055bd0724-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985335 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985361 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 
19:28:12.985427 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985482 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcjp\" (UniqueName: \"kubernetes.io/projected/b8b75602-eb3c-41f2-85d9-e5b055bd0724-kube-api-access-ztcjp\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985532 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8b75602-eb3c-41f2-85d9-e5b055bd0724-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985560 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc kubenswrapper[4892]: I0217 19:28:12.985706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:12 crc 
kubenswrapper[4892]: I0217 19:28:12.985743 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-config\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.008754 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.009339 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.010402 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-config\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.010430 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b8b75602-eb3c-41f2-85d9-e5b055bd0724-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 
crc kubenswrapper[4892]: I0217 19:28:13.013031 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8b75602-eb3c-41f2-85d9-e5b055bd0724-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.013309 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8b75602-eb3c-41f2-85d9-e5b055bd0724-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.019241 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.019287 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7dbde491f378cc98cf7fc4990a432b63c74773c214a5a5120dcaa8f6b5196e19/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.019999 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.020262 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b8b75602-eb3c-41f2-85d9-e5b055bd0724-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.035576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcjp\" (UniqueName: \"kubernetes.io/projected/b8b75602-eb3c-41f2-85d9-e5b055bd0724-kube-api-access-ztcjp\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.103996 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ef27487-d6cb-4bb2-a8b2-012837ef6f07\") pod \"prometheus-metric-storage-0\" (UID: \"b8b75602-eb3c-41f2-85d9-e5b055bd0724\") " pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.262600 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.409554 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.480391 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.617412 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config-secret\") pod \"68d93bba-ae69-4c30-8c55-7818f38437b7\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.617565 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dbnt\" (UniqueName: \"kubernetes.io/projected/68d93bba-ae69-4c30-8c55-7818f38437b7-kube-api-access-4dbnt\") pod \"68d93bba-ae69-4c30-8c55-7818f38437b7\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.617674 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config\") pod \"68d93bba-ae69-4c30-8c55-7818f38437b7\" (UID: \"68d93bba-ae69-4c30-8c55-7818f38437b7\") " Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.637864 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d93bba-ae69-4c30-8c55-7818f38437b7-kube-api-access-4dbnt" (OuterVolumeSpecName: "kube-api-access-4dbnt") pod "68d93bba-ae69-4c30-8c55-7818f38437b7" (UID: "68d93bba-ae69-4c30-8c55-7818f38437b7"). InnerVolumeSpecName "kube-api-access-4dbnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.656515 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "68d93bba-ae69-4c30-8c55-7818f38437b7" (UID: "68d93bba-ae69-4c30-8c55-7818f38437b7"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.683619 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "68d93bba-ae69-4c30-8c55-7818f38437b7" (UID: "68d93bba-ae69-4c30-8c55-7818f38437b7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.720337 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.720369 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/68d93bba-ae69-4c30-8c55-7818f38437b7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.720382 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dbnt\" (UniqueName: \"kubernetes.io/projected/68d93bba-ae69-4c30-8c55-7818f38437b7-kube-api-access-4dbnt\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.916674 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9","Type":"ContainerStarted","Data":"6954ff905188a545b81aa3f567052fceb317855d23253efa4a5709029c111ab9"} Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.918507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"7e8e76a7-b0ec-4b83-b288-c80fdd74ff97","Type":"ContainerStarted","Data":"db5167c2123f903c1cf06a85be86cb5840adc859376621dd830ef1af2cd445a6"} Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.918605 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.920164 4892 generic.go:334] "Generic (PLEG): container finished" podID="68d93bba-ae69-4c30-8c55-7818f38437b7" containerID="8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675" exitCode=137 Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.921232 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.922983 4892 scope.go:117] "RemoveContainer" containerID="8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.941732 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.335373929 podStartE2EDuration="2.941715431s" podCreationTimestamp="2026-02-17 19:28:11 +0000 UTC" firstStartedPulling="2026-02-17 19:28:12.495027169 +0000 UTC m=+6263.870430434" lastFinishedPulling="2026-02-17 19:28:13.101368671 +0000 UTC m=+6264.476771936" observedRunningTime="2026-02-17 19:28:13.935168704 +0000 UTC m=+6265.310571969" watchObservedRunningTime="2026-02-17 19:28:13.941715431 +0000 UTC m=+6265.317118696" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.944521 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="68d93bba-ae69-4c30-8c55-7818f38437b7" podUID="3519261b-9df6-4bbd-976b-a6987e030742" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.962663 4892 scope.go:117] "RemoveContainer" 
containerID="8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675" Feb 17 19:28:13 crc kubenswrapper[4892]: E0217 19:28:13.963182 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675\": container with ID starting with 8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675 not found: ID does not exist" containerID="8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675" Feb 17 19:28:13 crc kubenswrapper[4892]: I0217 19:28:13.963227 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675"} err="failed to get container status \"8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675\": rpc error: code = NotFound desc = could not find container \"8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675\": container with ID starting with 8623c66734141f6b22c688965c38d0cdd1c1b073fef5a470da15f185f4dbe675 not found: ID does not exist" Feb 17 19:28:14 crc kubenswrapper[4892]: I0217 19:28:14.041311 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 19:28:14 crc kubenswrapper[4892]: I0217 19:28:14.932055 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b8b75602-eb3c-41f2-85d9-e5b055bd0724","Type":"ContainerStarted","Data":"5588ae238ff816869c8fa891829597bd8b8276db07c4dbab8687d3f613d5dad1"} Feb 17 19:28:15 crc kubenswrapper[4892]: I0217 19:28:15.380737 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d93bba-ae69-4c30-8c55-7818f38437b7" path="/var/lib/kubelet/pods/68d93bba-ae69-4c30-8c55-7818f38437b7/volumes" Feb 17 19:28:19 crc kubenswrapper[4892]: I0217 19:28:19.988664 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9","Type":"ContainerStarted","Data":"5c4799c83a69810ec748aa9b98786c9d3304250f317a935a7f842809dc7e9d44"} Feb 17 19:28:19 crc kubenswrapper[4892]: I0217 19:28:19.993259 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b8b75602-eb3c-41f2-85d9-e5b055bd0724","Type":"ContainerStarted","Data":"e9a6d5f3784619ea727f9f30914cac1466d87402a4a60a372a95c3fdd24a78ec"} Feb 17 19:28:20 crc kubenswrapper[4892]: I0217 19:28:20.710506 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jhfhx" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" probeResult="failure" output=< Feb 17 19:28:20 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:28:20 crc kubenswrapper[4892]: > Feb 17 19:28:21 crc kubenswrapper[4892]: I0217 19:28:21.640596 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 19:28:27 crc kubenswrapper[4892]: I0217 19:28:27.084872 4892 generic.go:334] "Generic (PLEG): container finished" podID="b1b1be0e-8ec2-48ea-967b-a89c7e20bea9" containerID="5c4799c83a69810ec748aa9b98786c9d3304250f317a935a7f842809dc7e9d44" exitCode=0 Feb 17 19:28:27 crc kubenswrapper[4892]: I0217 19:28:27.084942 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9","Type":"ContainerDied","Data":"5c4799c83a69810ec748aa9b98786c9d3304250f317a935a7f842809dc7e9d44"} Feb 17 19:28:28 crc kubenswrapper[4892]: I0217 19:28:28.099600 4892 generic.go:334] "Generic (PLEG): container finished" podID="b8b75602-eb3c-41f2-85d9-e5b055bd0724" containerID="e9a6d5f3784619ea727f9f30914cac1466d87402a4a60a372a95c3fdd24a78ec" exitCode=0 Feb 17 19:28:28 crc kubenswrapper[4892]: I0217 19:28:28.099882 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b8b75602-eb3c-41f2-85d9-e5b055bd0724","Type":"ContainerDied","Data":"e9a6d5f3784619ea727f9f30914cac1466d87402a4a60a372a95c3fdd24a78ec"} Feb 17 19:28:29 crc kubenswrapper[4892]: I0217 19:28:29.673299 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jhfhx" Feb 17 19:28:29 crc kubenswrapper[4892]: I0217 19:28:29.745470 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jhfhx" Feb 17 19:28:29 crc kubenswrapper[4892]: I0217 19:28:29.915279 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jhfhx"] Feb 17 19:28:30 crc kubenswrapper[4892]: I0217 19:28:30.127961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9","Type":"ContainerStarted","Data":"68379842ce8f44490439c53216d6c194d6f099110182481e9b84bab13bc55086"} Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.138166 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jhfhx" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" containerID="cri-o://6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf" gracePeriod=2 Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.726316 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jhfhx" Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.803922 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgld9\" (UniqueName: \"kubernetes.io/projected/96ed9f69-fd36-40cd-821f-01de224abc3c-kube-api-access-xgld9\") pod \"96ed9f69-fd36-40cd-821f-01de224abc3c\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.803983 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-utilities\") pod \"96ed9f69-fd36-40cd-821f-01de224abc3c\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.804094 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-catalog-content\") pod \"96ed9f69-fd36-40cd-821f-01de224abc3c\" (UID: \"96ed9f69-fd36-40cd-821f-01de224abc3c\") " Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.804970 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-utilities" (OuterVolumeSpecName: "utilities") pod "96ed9f69-fd36-40cd-821f-01de224abc3c" (UID: "96ed9f69-fd36-40cd-821f-01de224abc3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.818720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ed9f69-fd36-40cd-821f-01de224abc3c-kube-api-access-xgld9" (OuterVolumeSpecName: "kube-api-access-xgld9") pod "96ed9f69-fd36-40cd-821f-01de224abc3c" (UID: "96ed9f69-fd36-40cd-821f-01de224abc3c"). InnerVolumeSpecName "kube-api-access-xgld9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.907260 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgld9\" (UniqueName: \"kubernetes.io/projected/96ed9f69-fd36-40cd-821f-01de224abc3c-kube-api-access-xgld9\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.907288 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:31 crc kubenswrapper[4892]: I0217 19:28:31.946149 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96ed9f69-fd36-40cd-821f-01de224abc3c" (UID: "96ed9f69-fd36-40cd-821f-01de224abc3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.009223 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ed9f69-fd36-40cd-821f-01de224abc3c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.159990 4892 generic.go:334] "Generic (PLEG): container finished" podID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerID="6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf" exitCode=0 Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.160057 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerDied","Data":"6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf"} Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.160286 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jhfhx" event={"ID":"96ed9f69-fd36-40cd-821f-01de224abc3c","Type":"ContainerDied","Data":"0184ccc28d162d9a966ab02b3e10f60c96b9c7d0c6ee63c508892dba196b8213"} Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.160308 4892 scope.go:117] "RemoveContainer" containerID="6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf" Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.160124 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jhfhx" Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.217867 4892 scope.go:117] "RemoveContainer" containerID="0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2" Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.218977 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jhfhx"] Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.238422 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jhfhx"] Feb 17 19:28:32 crc kubenswrapper[4892]: I0217 19:28:32.245445 4892 scope.go:117] "RemoveContainer" containerID="89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990" Feb 17 19:28:33 crc kubenswrapper[4892]: I0217 19:28:33.372790 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" path="/var/lib/kubelet/pods/96ed9f69-fd36-40cd-821f-01de224abc3c/volumes" Feb 17 19:28:34 crc kubenswrapper[4892]: I0217 19:28:34.187891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b1b1be0e-8ec2-48ea-967b-a89c7e20bea9","Type":"ContainerStarted","Data":"a8a84c2256a11cd6c6965138c544ecae7863ec8f87d3572d84f7778b1eb4212a"} Feb 17 19:28:34 crc kubenswrapper[4892]: I0217 19:28:34.188483 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:34 crc kubenswrapper[4892]: I0217 19:28:34.191562 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 17 19:28:34 crc kubenswrapper[4892]: I0217 19:28:34.224247 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.8253833 podStartE2EDuration="22.224225676s" podCreationTimestamp="2026-02-17 19:28:12 +0000 UTC" firstStartedPulling="2026-02-17 19:28:13.282442736 +0000 UTC m=+6264.657846001" lastFinishedPulling="2026-02-17 19:28:29.681285112 +0000 UTC m=+6281.056688377" observedRunningTime="2026-02-17 19:28:34.216768814 +0000 UTC m=+6285.592172079" watchObservedRunningTime="2026-02-17 19:28:34.224225676 +0000 UTC m=+6285.599628961" Feb 17 19:28:35 crc kubenswrapper[4892]: I0217 19:28:35.371889 4892 scope.go:117] "RemoveContainer" containerID="6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf" Feb 17 19:28:35 crc kubenswrapper[4892]: E0217 19:28:35.373725 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf\": container with ID starting with 6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf not found: ID does not exist" containerID="6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf" Feb 17 19:28:35 crc kubenswrapper[4892]: I0217 19:28:35.373795 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf"} err="failed to get container status \"6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf\": rpc error: code = NotFound desc = could not find container \"6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf\": container with ID starting with 
6782d546b5df9fa48763b2fc680435e7f2c173ae2246551799ab644f49fac4cf not found: ID does not exist" Feb 17 19:28:35 crc kubenswrapper[4892]: I0217 19:28:35.373876 4892 scope.go:117] "RemoveContainer" containerID="0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2" Feb 17 19:28:35 crc kubenswrapper[4892]: E0217 19:28:35.374517 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2\": container with ID starting with 0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2 not found: ID does not exist" containerID="0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2" Feb 17 19:28:35 crc kubenswrapper[4892]: I0217 19:28:35.374569 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2"} err="failed to get container status \"0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2\": rpc error: code = NotFound desc = could not find container \"0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2\": container with ID starting with 0057a003804d42eaec2645ed6e2c6b72ea7c0c0533a51a00da85a5488d36bac2 not found: ID does not exist" Feb 17 19:28:35 crc kubenswrapper[4892]: I0217 19:28:35.374598 4892 scope.go:117] "RemoveContainer" containerID="89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990" Feb 17 19:28:35 crc kubenswrapper[4892]: E0217 19:28:35.375006 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990\": container with ID starting with 89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990 not found: ID does not exist" containerID="89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990" Feb 17 19:28:35 crc 
kubenswrapper[4892]: I0217 19:28:35.375055 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990"} err="failed to get container status \"89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990\": rpc error: code = NotFound desc = could not find container \"89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990\": container with ID starting with 89e01ce7658acd088dbc50625d19093613c3a7ebc1cbb213a4bbb40f19919990 not found: ID does not exist" Feb 17 19:28:36 crc kubenswrapper[4892]: I0217 19:28:36.217452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b8b75602-eb3c-41f2-85d9-e5b055bd0724","Type":"ContainerStarted","Data":"08a6c2fba3d784a42c92d6be23d1b731607e3042e686bea85ae131f2ebb2cf46"} Feb 17 19:28:40 crc kubenswrapper[4892]: I0217 19:28:40.291164 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b8b75602-eb3c-41f2-85d9-e5b055bd0724","Type":"ContainerStarted","Data":"8f91624d7c0a63ff3a966b70e483d02a80bb4794c5f707ebd3af65d2709c4abc"} Feb 17 19:28:45 crc kubenswrapper[4892]: I0217 19:28:45.373181 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b8b75602-eb3c-41f2-85d9-e5b055bd0724","Type":"ContainerStarted","Data":"7369482a173cefbe49f7f16cbfdf97e5811b9df0437dfc1c8c98a6f33f05d0e9"} Feb 17 19:28:45 crc kubenswrapper[4892]: I0217 19:28:45.397869 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.8540091309999998 podStartE2EDuration="34.39785365s" podCreationTimestamp="2026-02-17 19:28:11 +0000 UTC" firstStartedPulling="2026-02-17 19:28:14.046440788 +0000 UTC m=+6265.421844053" lastFinishedPulling="2026-02-17 19:28:44.590285307 +0000 UTC m=+6295.965688572" 
observedRunningTime="2026-02-17 19:28:45.389151204 +0000 UTC m=+6296.764554459" watchObservedRunningTime="2026-02-17 19:28:45.39785365 +0000 UTC m=+6296.773256905" Feb 17 19:28:48 crc kubenswrapper[4892]: I0217 19:28:48.413952 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.227671 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:28:50 crc kubenswrapper[4892]: E0217 19:28:50.228703 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="extract-utilities" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.228719 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="extract-utilities" Feb 17 19:28:50 crc kubenswrapper[4892]: E0217 19:28:50.228778 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.228787 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" Feb 17 19:28:50 crc kubenswrapper[4892]: E0217 19:28:50.228815 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="extract-content" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.228840 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="extract-content" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.229158 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ed9f69-fd36-40cd-821f-01de224abc3c" containerName="registry-server" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.231912 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.233676 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.233750 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.244638 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319639 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319710 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-scripts\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319746 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-run-httpd\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319800 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " 
pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319868 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqcd2\" (UniqueName: \"kubernetes.io/projected/783172fc-f2f2-4207-bafb-92b82c8477be-kube-api-access-xqcd2\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319893 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-log-httpd\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.319981 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-config-data\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422289 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422343 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqcd2\" (UniqueName: \"kubernetes.io/projected/783172fc-f2f2-4207-bafb-92b82c8477be-kube-api-access-xqcd2\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422363 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-log-httpd\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422399 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-config-data\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422510 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422538 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-scripts\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.422559 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-run-httpd\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.423016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-run-httpd\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 
19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.424109 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-log-httpd\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.430058 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-scripts\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.430690 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.430744 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.431226 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-config-data\") pod \"ceilometer-0\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.442864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqcd2\" (UniqueName: \"kubernetes.io/projected/783172fc-f2f2-4207-bafb-92b82c8477be-kube-api-access-xqcd2\") pod \"ceilometer-0\" (UID: 
\"783172fc-f2f2-4207-bafb-92b82c8477be\") " pod="openstack/ceilometer-0" Feb 17 19:28:50 crc kubenswrapper[4892]: I0217 19:28:50.560967 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.050522 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d240-account-create-update-sxl8m"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.069983 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d240-account-create-update-sxl8m"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.079856 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6t524"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.088793 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-01db-account-create-update-5crxt"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.103357 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ltfnh"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.112475 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6t524"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.129411 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-01db-account-create-update-5crxt"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.140539 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ltfnh"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.153217 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.375405 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40507831-2854-429d-b79a-2da3e53325ba" path="/var/lib/kubelet/pods/40507831-2854-429d-b79a-2da3e53325ba/volumes" 
Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.376457 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953c04db-941c-4906-92b2-69578f20f3ce" path="/var/lib/kubelet/pods/953c04db-941c-4906-92b2-69578f20f3ce/volumes" Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.377284 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9502d16-c7a7-47ab-8103-f05229ca14ae" path="/var/lib/kubelet/pods/e9502d16-c7a7-47ab-8103-f05229ca14ae/volumes" Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.378671 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b34438-358d-4397-8d7a-0489bddea606" path="/var/lib/kubelet/pods/e9b34438-358d-4397-8d7a-0489bddea606/volumes" Feb 17 19:28:51 crc kubenswrapper[4892]: I0217 19:28:51.448093 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerStarted","Data":"d0322e458d1499f43b22f00fce261d1edbbefb86bf53127ca3a889b585ce83f0"} Feb 17 19:28:52 crc kubenswrapper[4892]: I0217 19:28:52.038534 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-7phmt"] Feb 17 19:28:52 crc kubenswrapper[4892]: I0217 19:28:52.055338 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7phmt"] Feb 17 19:28:52 crc kubenswrapper[4892]: I0217 19:28:52.070736 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9160-account-create-update-2ph45"] Feb 17 19:28:52 crc kubenswrapper[4892]: I0217 19:28:52.079830 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9160-account-create-update-2ph45"] Feb 17 19:28:52 crc kubenswrapper[4892]: I0217 19:28:52.460424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerStarted","Data":"6696b36d09e84647ded1a7da2719c938844118d729b77876495a8f9e895c2e4c"} Feb 17 19:28:52 crc kubenswrapper[4892]: I0217 19:28:52.460760 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerStarted","Data":"5904c1c707391722ce5792e20eb31627df929cb22c22d91b0ad4b549e3bede90"} Feb 17 19:28:53 crc kubenswrapper[4892]: I0217 19:28:53.376743 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a22ec1e-d67b-40f0-9adc-0cc81be398a5" path="/var/lib/kubelet/pods/0a22ec1e-d67b-40f0-9adc-0cc81be398a5/volumes" Feb 17 19:28:53 crc kubenswrapper[4892]: I0217 19:28:53.379287 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751ad68c-37cf-403a-8a09-e0ff6a096874" path="/var/lib/kubelet/pods/751ad68c-37cf-403a-8a09-e0ff6a096874/volumes" Feb 17 19:28:53 crc kubenswrapper[4892]: I0217 19:28:53.477176 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerStarted","Data":"4fefcb8ddad77b3ebb1a22470cdcc07b188cd554704bc6755ae777cdcfae6a52"} Feb 17 19:28:55 crc kubenswrapper[4892]: I0217 19:28:55.511050 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerStarted","Data":"efe0e35522ac404b0815649ab9349270b70e570ee79ab52083cf7938fe9285dc"} Feb 17 19:28:55 crc kubenswrapper[4892]: I0217 19:28:55.513366 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 19:28:55 crc kubenswrapper[4892]: I0217 19:28:55.537699 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.420835324 podStartE2EDuration="5.537683423s" podCreationTimestamp="2026-02-17 19:28:50 +0000 UTC" 
firstStartedPulling="2026-02-17 19:28:51.143113438 +0000 UTC m=+6302.518516703" lastFinishedPulling="2026-02-17 19:28:54.259961537 +0000 UTC m=+6305.635364802" observedRunningTime="2026-02-17 19:28:55.53571328 +0000 UTC m=+6306.911116555" watchObservedRunningTime="2026-02-17 19:28:55.537683423 +0000 UTC m=+6306.913086698" Feb 17 19:28:58 crc kubenswrapper[4892]: I0217 19:28:58.411363 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:58 crc kubenswrapper[4892]: I0217 19:28:58.414490 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 17 19:28:58 crc kubenswrapper[4892]: I0217 19:28:58.550682 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.511742 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-kbm4j"] Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.514751 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.522907 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kbm4j"] Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.647570 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df073dc-7ac0-41c7-899b-7373b161da67-operator-scripts\") pod \"aodh-db-create-kbm4j\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.647889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwh5l\" (UniqueName: \"kubernetes.io/projected/7df073dc-7ac0-41c7-899b-7373b161da67-kube-api-access-lwh5l\") pod \"aodh-db-create-kbm4j\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.703294 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-b720-account-create-update-mng59"] Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.705005 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.707715 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.714580 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b720-account-create-update-mng59"] Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.752709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df073dc-7ac0-41c7-899b-7373b161da67-operator-scripts\") pod \"aodh-db-create-kbm4j\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.755016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwh5l\" (UniqueName: \"kubernetes.io/projected/7df073dc-7ac0-41c7-899b-7373b161da67-kube-api-access-lwh5l\") pod \"aodh-db-create-kbm4j\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.755707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df073dc-7ac0-41c7-899b-7373b161da67-operator-scripts\") pod \"aodh-db-create-kbm4j\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.779072 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwh5l\" (UniqueName: \"kubernetes.io/projected/7df073dc-7ac0-41c7-899b-7373b161da67-kube-api-access-lwh5l\") pod \"aodh-db-create-kbm4j\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.858143 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqzq\" (UniqueName: \"kubernetes.io/projected/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-kube-api-access-6qqzq\") pod \"aodh-b720-account-create-update-mng59\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.858188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-operator-scripts\") pod \"aodh-b720-account-create-update-mng59\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.868445 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.960612 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qqzq\" (UniqueName: \"kubernetes.io/projected/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-kube-api-access-6qqzq\") pod \"aodh-b720-account-create-update-mng59\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.960658 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-operator-scripts\") pod \"aodh-b720-account-create-update-mng59\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.961442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-operator-scripts\") pod \"aodh-b720-account-create-update-mng59\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:02 crc kubenswrapper[4892]: I0217 19:29:02.979717 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qqzq\" (UniqueName: \"kubernetes.io/projected/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-kube-api-access-6qqzq\") pod \"aodh-b720-account-create-update-mng59\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.023477 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.354693 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kbm4j"] Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.522417 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b720-account-create-update-mng59"] Feb 17 19:29:03 crc kubenswrapper[4892]: W0217 19:29:03.530185 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724bf084_29f5_4fdf_b1b4_bc3ab7cfe4a7.slice/crio-3c03ac9bd4f75023b55407c1d29a9f14264db857fd911cf8baf61a400a9e5185 WatchSource:0}: Error finding container 3c03ac9bd4f75023b55407c1d29a9f14264db857fd911cf8baf61a400a9e5185: Status 404 returned error can't find the container with id 3c03ac9bd4f75023b55407c1d29a9f14264db857fd911cf8baf61a400a9e5185 Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.622759 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b720-account-create-update-mng59" 
event={"ID":"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7","Type":"ContainerStarted","Data":"3c03ac9bd4f75023b55407c1d29a9f14264db857fd911cf8baf61a400a9e5185"} Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.624310 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kbm4j" event={"ID":"7df073dc-7ac0-41c7-899b-7373b161da67","Type":"ContainerStarted","Data":"4273ef726e9e8dfff3490bf9e13a16d9ef13a953fb1a85c34f1203a79fb46adf"} Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.624331 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kbm4j" event={"ID":"7df073dc-7ac0-41c7-899b-7373b161da67","Type":"ContainerStarted","Data":"19f30f0d2cd96b2cd6501b11793d0e804080e949bcf3e63418fb5f50b796615f"} Feb 17 19:29:03 crc kubenswrapper[4892]: I0217 19:29:03.642513 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-kbm4j" podStartSLOduration=1.642498529 podStartE2EDuration="1.642498529s" podCreationTimestamp="2026-02-17 19:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:29:03.640242167 +0000 UTC m=+6315.015645422" watchObservedRunningTime="2026-02-17 19:29:03.642498529 +0000 UTC m=+6315.017901794" Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.411716 4892 scope.go:117] "RemoveContainer" containerID="13f6f892a55a2f43a917e4a35d18bc360b8ba41d67d20fbd612812517991abc9" Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.442033 4892 scope.go:117] "RemoveContainer" containerID="8cf66092601ece5e6e73f766aad3a5cde9245c52106e26fc7e4ed6fd6c31db29" Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.493500 4892 scope.go:117] "RemoveContainer" containerID="668d4f2f067d0eb8d9008384e88dc8609ef1a33aa8c63f63eb80946319a2abe7" Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.567441 4892 scope.go:117] "RemoveContainer" 
containerID="a250680d484833a0b789f6bc0d411c9d61592e521cfc90b94a5ebf0331386fc0" Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.616038 4892 scope.go:117] "RemoveContainer" containerID="08b9f92b44090339833a5a7f102d02ee2e686638c10087a7ae249439ca64d4fd" Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.647188 4892 generic.go:334] "Generic (PLEG): container finished" podID="724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" containerID="9675d4aaac3723cfa4ebe733a0da53c4d14eb870405c06b52bc3532e86d51b6d" exitCode=0 Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.647285 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b720-account-create-update-mng59" event={"ID":"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7","Type":"ContainerDied","Data":"9675d4aaac3723cfa4ebe733a0da53c4d14eb870405c06b52bc3532e86d51b6d"} Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.654343 4892 generic.go:334] "Generic (PLEG): container finished" podID="7df073dc-7ac0-41c7-899b-7373b161da67" containerID="4273ef726e9e8dfff3490bf9e13a16d9ef13a953fb1a85c34f1203a79fb46adf" exitCode=0 Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.654405 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kbm4j" event={"ID":"7df073dc-7ac0-41c7-899b-7373b161da67","Type":"ContainerDied","Data":"4273ef726e9e8dfff3490bf9e13a16d9ef13a953fb1a85c34f1203a79fb46adf"} Feb 17 19:29:04 crc kubenswrapper[4892]: I0217 19:29:04.677761 4892 scope.go:117] "RemoveContainer" containerID="d6cf0779e1b78cc9bdba4d1263425f4c6c8e7d26a4037734497d6f154374b45c" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.062325 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sqz4g"] Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.077591 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sqz4g"] Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.237191 4892 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.263470 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.363033 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qqzq\" (UniqueName: \"kubernetes.io/projected/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-kube-api-access-6qqzq\") pod \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.363191 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwh5l\" (UniqueName: \"kubernetes.io/projected/7df073dc-7ac0-41c7-899b-7373b161da67-kube-api-access-lwh5l\") pod \"7df073dc-7ac0-41c7-899b-7373b161da67\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.363340 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-operator-scripts\") pod \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\" (UID: \"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7\") " Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.363516 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df073dc-7ac0-41c7-899b-7373b161da67-operator-scripts\") pod \"7df073dc-7ac0-41c7-899b-7373b161da67\" (UID: \"7df073dc-7ac0-41c7-899b-7373b161da67\") " Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.364456 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df073dc-7ac0-41c7-899b-7373b161da67-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "7df073dc-7ac0-41c7-899b-7373b161da67" (UID: "7df073dc-7ac0-41c7-899b-7373b161da67"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.365036 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" (UID: "724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.390449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-kube-api-access-6qqzq" (OuterVolumeSpecName: "kube-api-access-6qqzq") pod "724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" (UID: "724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7"). InnerVolumeSpecName "kube-api-access-6qqzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.390667 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df073dc-7ac0-41c7-899b-7373b161da67-kube-api-access-lwh5l" (OuterVolumeSpecName: "kube-api-access-lwh5l") pod "7df073dc-7ac0-41c7-899b-7373b161da67" (UID: "7df073dc-7ac0-41c7-899b-7373b161da67"). InnerVolumeSpecName "kube-api-access-lwh5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.467677 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df073dc-7ac0-41c7-899b-7373b161da67-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.467711 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qqzq\" (UniqueName: \"kubernetes.io/projected/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-kube-api-access-6qqzq\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.467723 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwh5l\" (UniqueName: \"kubernetes.io/projected/7df073dc-7ac0-41c7-899b-7373b161da67-kube-api-access-lwh5l\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.467735 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.685928 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kbm4j" event={"ID":"7df073dc-7ac0-41c7-899b-7373b161da67","Type":"ContainerDied","Data":"19f30f0d2cd96b2cd6501b11793d0e804080e949bcf3e63418fb5f50b796615f"} Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.685967 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f30f0d2cd96b2cd6501b11793d0e804080e949bcf3e63418fb5f50b796615f" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.685987 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-kbm4j" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.688379 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b720-account-create-update-mng59" event={"ID":"724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7","Type":"ContainerDied","Data":"3c03ac9bd4f75023b55407c1d29a9f14264db857fd911cf8baf61a400a9e5185"} Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.688400 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c03ac9bd4f75023b55407c1d29a9f14264db857fd911cf8baf61a400a9e5185" Feb 17 19:29:06 crc kubenswrapper[4892]: I0217 19:29:06.688438 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b720-account-create-update-mng59" Feb 17 19:29:07 crc kubenswrapper[4892]: I0217 19:29:07.381443 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc" path="/var/lib/kubelet/pods/8ee6a2b0-3ba3-46cb-92c0-1b9a15b027dc/volumes" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.056804 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-pdh2g"] Feb 17 19:29:08 crc kubenswrapper[4892]: E0217 19:29:08.057318 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" containerName="mariadb-account-create-update" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.057331 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" containerName="mariadb-account-create-update" Feb 17 19:29:08 crc kubenswrapper[4892]: E0217 19:29:08.057361 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df073dc-7ac0-41c7-899b-7373b161da67" containerName="mariadb-database-create" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.057369 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df073dc-7ac0-41c7-899b-7373b161da67" 
containerName="mariadb-database-create" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.057620 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df073dc-7ac0-41c7-899b-7373b161da67" containerName="mariadb-database-create" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.057644 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" containerName="mariadb-account-create-update" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.058402 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.060528 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.060546 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.060797 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.062656 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-r2wdc" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.093049 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pdh2g"] Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.223521 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkxw7\" (UniqueName: \"kubernetes.io/projected/24578eef-81b5-44bd-acac-2352f1ce2892-kube-api-access-kkxw7\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.223610 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-combined-ca-bundle\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.223848 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-scripts\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.224196 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-config-data\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.325849 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkxw7\" (UniqueName: \"kubernetes.io/projected/24578eef-81b5-44bd-acac-2352f1ce2892-kube-api-access-kkxw7\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.325911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-combined-ca-bundle\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.325979 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-scripts\") pod 
\"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.326096 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-config-data\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.332889 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-scripts\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.333555 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-config-data\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.344521 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-combined-ca-bundle\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.347933 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkxw7\" (UniqueName: \"kubernetes.io/projected/24578eef-81b5-44bd-acac-2352f1ce2892-kube-api-access-kkxw7\") pod \"aodh-db-sync-pdh2g\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.380532 4892 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:08 crc kubenswrapper[4892]: I0217 19:29:08.922100 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pdh2g"] Feb 17 19:29:09 crc kubenswrapper[4892]: I0217 19:29:09.736571 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pdh2g" event={"ID":"24578eef-81b5-44bd-acac-2352f1ce2892","Type":"ContainerStarted","Data":"2e1919535475e99e5158d874de6d1bc7127e867bec1066d39574ca5a11bde77a"} Feb 17 19:29:14 crc kubenswrapper[4892]: I0217 19:29:14.807048 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pdh2g" event={"ID":"24578eef-81b5-44bd-acac-2352f1ce2892","Type":"ContainerStarted","Data":"832666fe0070771b40d542c667cd46a363db384ea90e60dbf62a678e5ef158bb"} Feb 17 19:29:14 crc kubenswrapper[4892]: I0217 19:29:14.838978 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-pdh2g" podStartSLOduration=1.8621235440000001 podStartE2EDuration="6.83895199s" podCreationTimestamp="2026-02-17 19:29:08 +0000 UTC" firstStartedPulling="2026-02-17 19:29:08.932567588 +0000 UTC m=+6320.307970853" lastFinishedPulling="2026-02-17 19:29:13.909396034 +0000 UTC m=+6325.284799299" observedRunningTime="2026-02-17 19:29:14.834286594 +0000 UTC m=+6326.209689909" watchObservedRunningTime="2026-02-17 19:29:14.83895199 +0000 UTC m=+6326.214355285" Feb 17 19:29:16 crc kubenswrapper[4892]: I0217 19:29:16.830611 4892 generic.go:334] "Generic (PLEG): container finished" podID="24578eef-81b5-44bd-acac-2352f1ce2892" containerID="832666fe0070771b40d542c667cd46a363db384ea90e60dbf62a678e5ef158bb" exitCode=0 Feb 17 19:29:16 crc kubenswrapper[4892]: I0217 19:29:16.830650 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pdh2g" 
event={"ID":"24578eef-81b5-44bd-acac-2352f1ce2892","Type":"ContainerDied","Data":"832666fe0070771b40d542c667cd46a363db384ea90e60dbf62a678e5ef158bb"} Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.372357 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.505515 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-combined-ca-bundle\") pod \"24578eef-81b5-44bd-acac-2352f1ce2892\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.506004 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-scripts\") pod \"24578eef-81b5-44bd-acac-2352f1ce2892\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.506067 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-config-data\") pod \"24578eef-81b5-44bd-acac-2352f1ce2892\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.506243 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkxw7\" (UniqueName: \"kubernetes.io/projected/24578eef-81b5-44bd-acac-2352f1ce2892-kube-api-access-kkxw7\") pod \"24578eef-81b5-44bd-acac-2352f1ce2892\" (UID: \"24578eef-81b5-44bd-acac-2352f1ce2892\") " Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.512550 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24578eef-81b5-44bd-acac-2352f1ce2892-kube-api-access-kkxw7" (OuterVolumeSpecName: 
"kube-api-access-kkxw7") pod "24578eef-81b5-44bd-acac-2352f1ce2892" (UID: "24578eef-81b5-44bd-acac-2352f1ce2892"). InnerVolumeSpecName "kube-api-access-kkxw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.513295 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-scripts" (OuterVolumeSpecName: "scripts") pod "24578eef-81b5-44bd-acac-2352f1ce2892" (UID: "24578eef-81b5-44bd-acac-2352f1ce2892"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.562672 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24578eef-81b5-44bd-acac-2352f1ce2892" (UID: "24578eef-81b5-44bd-acac-2352f1ce2892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.568993 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-config-data" (OuterVolumeSpecName: "config-data") pod "24578eef-81b5-44bd-acac-2352f1ce2892" (UID: "24578eef-81b5-44bd-acac-2352f1ce2892"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.609879 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkxw7\" (UniqueName: \"kubernetes.io/projected/24578eef-81b5-44bd-acac-2352f1ce2892-kube-api-access-kkxw7\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.609915 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.609927 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.609975 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24578eef-81b5-44bd-acac-2352f1ce2892-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.866957 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pdh2g" event={"ID":"24578eef-81b5-44bd-acac-2352f1ce2892","Type":"ContainerDied","Data":"2e1919535475e99e5158d874de6d1bc7127e867bec1066d39574ca5a11bde77a"} Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.867028 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1919535475e99e5158d874de6d1bc7127e867bec1066d39574ca5a11bde77a" Feb 17 19:29:18 crc kubenswrapper[4892]: I0217 19:29:18.867118 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pdh2g" Feb 17 19:29:20 crc kubenswrapper[4892]: I0217 19:29:20.571223 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.198344 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 17 19:29:23 crc kubenswrapper[4892]: E0217 19:29:23.199433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24578eef-81b5-44bd-acac-2352f1ce2892" containerName="aodh-db-sync" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.199448 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="24578eef-81b5-44bd-acac-2352f1ce2892" containerName="aodh-db-sync" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.199755 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="24578eef-81b5-44bd-acac-2352f1ce2892" containerName="aodh-db-sync" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.203024 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.207590 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.208576 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.208595 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-r2wdc" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.219957 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.379455 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.380159 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-scripts\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.382111 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-config-data\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.382352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ht96\" (UniqueName: 
\"kubernetes.io/projected/d4722499-5387-425c-a006-1664b733c70e-kube-api-access-4ht96\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.484742 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-scripts\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.484796 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-config-data\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.484858 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ht96\" (UniqueName: \"kubernetes.io/projected/d4722499-5387-425c-a006-1664b733c70e-kube-api-access-4ht96\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.484984 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.494538 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-scripts\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.496833 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-config-data\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.511476 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4722499-5387-425c-a006-1664b733c70e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.537605 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ht96\" (UniqueName: \"kubernetes.io/projected/d4722499-5387-425c-a006-1664b733c70e-kube-api-access-4ht96\") pod \"aodh-0\" (UID: \"d4722499-5387-425c-a006-1664b733c70e\") " pod="openstack/aodh-0" Feb 17 19:29:23 crc kubenswrapper[4892]: I0217 19:29:23.543315 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 17 19:29:24 crc kubenswrapper[4892]: I0217 19:29:24.033863 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7f66c"] Feb 17 19:29:24 crc kubenswrapper[4892]: I0217 19:29:24.045212 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7f66c"] Feb 17 19:29:24 crc kubenswrapper[4892]: I0217 19:29:24.157490 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 17 19:29:24 crc kubenswrapper[4892]: I0217 19:29:24.945665 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d4722499-5387-425c-a006-1664b733c70e","Type":"ContainerStarted","Data":"0dd435d1e6c197e7743e22f247591dc5ada66f261173461c0ed9710ffd2367df"} Feb 17 19:29:24 crc kubenswrapper[4892]: I0217 19:29:24.946035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d4722499-5387-425c-a006-1664b733c70e","Type":"ContainerStarted","Data":"3da2d9584b9efaae7bc6ef2f7d273a6d55831e7e33d4ab7189eed06dab94fb1c"} Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.048414 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-j7t5v"] Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.056748 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-j7t5v"] Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.373096 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bf5754-5e90-44b7-ab1b-f4883dc02a8f" path="/var/lib/kubelet/pods/31bf5754-5e90-44b7-ab1b-f4883dc02a8f/volumes" Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.374352 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a454e7-b6ca-4e29-9579-6e11202bcf98" path="/var/lib/kubelet/pods/e6a454e7-b6ca-4e29-9579-6e11202bcf98/volumes" Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 
19:29:25.546073 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.546407 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-central-agent" containerID="cri-o://5904c1c707391722ce5792e20eb31627df929cb22c22d91b0ad4b549e3bede90" gracePeriod=30 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.546558 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="proxy-httpd" containerID="cri-o://efe0e35522ac404b0815649ab9349270b70e570ee79ab52083cf7938fe9285dc" gracePeriod=30 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.546608 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="sg-core" containerID="cri-o://4fefcb8ddad77b3ebb1a22470cdcc07b188cd554704bc6755ae777cdcfae6a52" gracePeriod=30 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.546688 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-notification-agent" containerID="cri-o://6696b36d09e84647ded1a7da2719c938844118d729b77876495a8f9e895c2e4c" gracePeriod=30 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.969790 4892 generic.go:334] "Generic (PLEG): container finished" podID="783172fc-f2f2-4207-bafb-92b82c8477be" containerID="efe0e35522ac404b0815649ab9349270b70e570ee79ab52083cf7938fe9285dc" exitCode=0 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.970099 4892 generic.go:334] "Generic (PLEG): container finished" podID="783172fc-f2f2-4207-bafb-92b82c8477be" 
containerID="4fefcb8ddad77b3ebb1a22470cdcc07b188cd554704bc6755ae777cdcfae6a52" exitCode=2 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.970110 4892 generic.go:334] "Generic (PLEG): container finished" podID="783172fc-f2f2-4207-bafb-92b82c8477be" containerID="5904c1c707391722ce5792e20eb31627df929cb22c22d91b0ad4b549e3bede90" exitCode=0 Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.969866 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerDied","Data":"efe0e35522ac404b0815649ab9349270b70e570ee79ab52083cf7938fe9285dc"} Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.970321 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerDied","Data":"4fefcb8ddad77b3ebb1a22470cdcc07b188cd554704bc6755ae777cdcfae6a52"} Feb 17 19:29:25 crc kubenswrapper[4892]: I0217 19:29:25.970334 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerDied","Data":"5904c1c707391722ce5792e20eb31627df929cb22c22d91b0ad4b549e3bede90"} Feb 17 19:29:26 crc kubenswrapper[4892]: I0217 19:29:26.980935 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d4722499-5387-425c-a006-1664b733c70e","Type":"ContainerStarted","Data":"229c92ebb34ac99631f8a0794cd19199665878aa4615b0c5d9223ac8bc850d1b"} Feb 17 19:29:28 crc kubenswrapper[4892]: I0217 19:29:28.016957 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d4722499-5387-425c-a006-1664b733c70e","Type":"ContainerStarted","Data":"84138da40e023cf4e51a315fcac6d8f3d9a7fdb6b2ec63457ca60df3186f695e"} Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.045640 4892 generic.go:334] "Generic (PLEG): container finished" podID="783172fc-f2f2-4207-bafb-92b82c8477be" 
containerID="6696b36d09e84647ded1a7da2719c938844118d729b77876495a8f9e895c2e4c" exitCode=0 Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.046127 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerDied","Data":"6696b36d09e84647ded1a7da2719c938844118d729b77876495a8f9e895c2e4c"} Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.046156 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"783172fc-f2f2-4207-bafb-92b82c8477be","Type":"ContainerDied","Data":"d0322e458d1499f43b22f00fce261d1edbbefb86bf53127ca3a889b585ce83f0"} Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.046169 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0322e458d1499f43b22f00fce261d1edbbefb86bf53127ca3a889b585ce83f0" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.062071 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.235599 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-sg-core-conf-yaml\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.235928 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqcd2\" (UniqueName: \"kubernetes.io/projected/783172fc-f2f2-4207-bafb-92b82c8477be-kube-api-access-xqcd2\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.235972 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-run-httpd\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.236024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-log-httpd\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.236062 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-config-data\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.236110 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-scripts\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.236159 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-combined-ca-bundle\") pod \"783172fc-f2f2-4207-bafb-92b82c8477be\" (UID: \"783172fc-f2f2-4207-bafb-92b82c8477be\") " Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.236560 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.236626 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.241022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-scripts" (OuterVolumeSpecName: "scripts") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.241626 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783172fc-f2f2-4207-bafb-92b82c8477be-kube-api-access-xqcd2" (OuterVolumeSpecName: "kube-api-access-xqcd2") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). InnerVolumeSpecName "kube-api-access-xqcd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.267540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.317593 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.339681 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.340007 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqcd2\" (UniqueName: \"kubernetes.io/projected/783172fc-f2f2-4207-bafb-92b82c8477be-kube-api-access-xqcd2\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.340091 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.340166 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/783172fc-f2f2-4207-bafb-92b82c8477be-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.340237 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.340306 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.346047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-config-data" (OuterVolumeSpecName: "config-data") pod "783172fc-f2f2-4207-bafb-92b82c8477be" (UID: "783172fc-f2f2-4207-bafb-92b82c8477be"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:29 crc kubenswrapper[4892]: I0217 19:29:29.446828 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783172fc-f2f2-4207-bafb-92b82c8477be-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.061184 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d4722499-5387-425c-a006-1664b733c70e","Type":"ContainerStarted","Data":"7853f61b463bcc49bac1e00c665d606671ba273b39265ed3daac7f0399a11b86"} Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.061218 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.084520 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.7783827159999999 podStartE2EDuration="7.084492671s" podCreationTimestamp="2026-02-17 19:29:23 +0000 UTC" firstStartedPulling="2026-02-17 19:29:24.150068579 +0000 UTC m=+6335.525471844" lastFinishedPulling="2026-02-17 19:29:29.456178534 +0000 UTC m=+6340.831581799" observedRunningTime="2026-02-17 19:29:30.080470381 +0000 UTC m=+6341.455873656" watchObservedRunningTime="2026-02-17 19:29:30.084492671 +0000 UTC m=+6341.459895976" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.138792 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.159328 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.172997 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:30 crc kubenswrapper[4892]: E0217 19:29:30.173520 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="proxy-httpd" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173539 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="proxy-httpd" Feb 17 19:29:30 crc kubenswrapper[4892]: E0217 19:29:30.173561 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-notification-agent" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173567 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-notification-agent" Feb 17 19:29:30 crc kubenswrapper[4892]: E0217 19:29:30.173579 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-central-agent" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173585 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-central-agent" Feb 17 19:29:30 crc kubenswrapper[4892]: E0217 19:29:30.173600 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="sg-core" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173606 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="sg-core" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173851 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="sg-core" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173868 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-notification-agent" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173891 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="proxy-httpd" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.173906 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" containerName="ceilometer-central-agent" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.179728 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.182296 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.183532 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.195701 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263150 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-run-httpd\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263306 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-config-data\") pod 
\"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263373 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-log-httpd\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263422 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263444 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2dld\" (UniqueName: \"kubernetes.io/projected/11d7e146-b34b-4407-ba25-c19028f250ac-kube-api-access-n2dld\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.263470 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-scripts\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.365270 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-config-data\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 
19:29:30.365359 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-log-httpd\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.365405 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.365431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2dld\" (UniqueName: \"kubernetes.io/projected/11d7e146-b34b-4407-ba25-c19028f250ac-kube-api-access-n2dld\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.365459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-scripts\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.365486 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-run-httpd\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.365531 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.366366 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-log-httpd\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.369427 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-run-httpd\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.371299 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.375629 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.375949 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-config-data\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.387476 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-scripts\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.395527 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2dld\" (UniqueName: \"kubernetes.io/projected/11d7e146-b34b-4407-ba25-c19028f250ac-kube-api-access-n2dld\") pod \"ceilometer-0\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " pod="openstack/ceilometer-0" Feb 17 19:29:30 crc kubenswrapper[4892]: I0217 19:29:30.508381 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:29:31 crc kubenswrapper[4892]: I0217 19:29:31.061956 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:31 crc kubenswrapper[4892]: I0217 19:29:31.373753 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783172fc-f2f2-4207-bafb-92b82c8477be" path="/var/lib/kubelet/pods/783172fc-f2f2-4207-bafb-92b82c8477be/volumes" Feb 17 19:29:32 crc kubenswrapper[4892]: I0217 19:29:32.121224 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerStarted","Data":"e4e7f3de758dcffad3c1b7afb225ab969ab3302b0b7b5d525f37ffca623a0c9d"} Feb 17 19:29:32 crc kubenswrapper[4892]: I0217 19:29:32.121523 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerStarted","Data":"4195f1d8ea7cc1b7c9c9253383733b73dafebc76eca59368d6651bb20a428387"} Feb 17 19:29:33 crc kubenswrapper[4892]: I0217 19:29:33.134846 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerStarted","Data":"ca1759fe3213ccf74056245143820c16aa75c8882f71fb4c2b5367c1df147716"} 
Feb 17 19:29:33 crc kubenswrapper[4892]: I0217 19:29:33.135557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerStarted","Data":"2fd09ccb026953c0d4fdbf7c124f47e8c78a8a4ca9b8831e521388b680f71874"} Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.175070 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerStarted","Data":"be644b50ba916308679414702f03d47e40b84c47c7741b31501290e4bed7ba4c"} Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.175976 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.208013 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.87479203 podStartE2EDuration="5.207994749s" podCreationTimestamp="2026-02-17 19:29:30 +0000 UTC" firstStartedPulling="2026-02-17 19:29:31.13229918 +0000 UTC m=+6342.507702445" lastFinishedPulling="2026-02-17 19:29:34.465501899 +0000 UTC m=+6345.840905164" observedRunningTime="2026-02-17 19:29:35.205207654 +0000 UTC m=+6346.580610939" watchObservedRunningTime="2026-02-17 19:29:35.207994749 +0000 UTC m=+6346.583398004" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.637864 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-zgcrt"] Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.639656 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.708248 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-79d7-account-create-update-t5x9n"] Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.709843 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.726782 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.727324 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106208e4-2541-4e90-af0c-ca382c617cab-operator-scripts\") pod \"manila-79d7-account-create-update-t5x9n\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.727508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304b8888-e8d3-48c0-82f1-e655ad9edc79-operator-scripts\") pod \"manila-db-create-zgcrt\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.727554 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66h9k\" (UniqueName: \"kubernetes.io/projected/304b8888-e8d3-48c0-82f1-e655ad9edc79-kube-api-access-66h9k\") pod \"manila-db-create-zgcrt\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.727593 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6sk\" (UniqueName: \"kubernetes.io/projected/106208e4-2541-4e90-af0c-ca382c617cab-kube-api-access-6b6sk\") pod \"manila-79d7-account-create-update-t5x9n\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.758797 4892 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-zgcrt"] Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.788006 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-79d7-account-create-update-t5x9n"] Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.830202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304b8888-e8d3-48c0-82f1-e655ad9edc79-operator-scripts\") pod \"manila-db-create-zgcrt\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.830291 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66h9k\" (UniqueName: \"kubernetes.io/projected/304b8888-e8d3-48c0-82f1-e655ad9edc79-kube-api-access-66h9k\") pod \"manila-db-create-zgcrt\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.830326 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6sk\" (UniqueName: \"kubernetes.io/projected/106208e4-2541-4e90-af0c-ca382c617cab-kube-api-access-6b6sk\") pod \"manila-79d7-account-create-update-t5x9n\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.830464 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106208e4-2541-4e90-af0c-ca382c617cab-operator-scripts\") pod \"manila-79d7-account-create-update-t5x9n\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.833182 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304b8888-e8d3-48c0-82f1-e655ad9edc79-operator-scripts\") pod \"manila-db-create-zgcrt\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.834513 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106208e4-2541-4e90-af0c-ca382c617cab-operator-scripts\") pod \"manila-79d7-account-create-update-t5x9n\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.852445 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6sk\" (UniqueName: \"kubernetes.io/projected/106208e4-2541-4e90-af0c-ca382c617cab-kube-api-access-6b6sk\") pod \"manila-79d7-account-create-update-t5x9n\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.855169 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66h9k\" (UniqueName: \"kubernetes.io/projected/304b8888-e8d3-48c0-82f1-e655ad9edc79-kube-api-access-66h9k\") pod \"manila-db-create-zgcrt\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:35 crc kubenswrapper[4892]: I0217 19:29:35.966478 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:36 crc kubenswrapper[4892]: I0217 19:29:36.069758 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:36 crc kubenswrapper[4892]: I0217 19:29:36.469435 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-zgcrt"] Feb 17 19:29:36 crc kubenswrapper[4892]: I0217 19:29:36.734089 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-79d7-account-create-update-t5x9n"] Feb 17 19:29:37 crc kubenswrapper[4892]: I0217 19:29:37.216989 4892 generic.go:334] "Generic (PLEG): container finished" podID="304b8888-e8d3-48c0-82f1-e655ad9edc79" containerID="d526d6d95aeb98886aca2a95cf83b731410c8652885314337721634c4cf6ba94" exitCode=0 Feb 17 19:29:37 crc kubenswrapper[4892]: I0217 19:29:37.217243 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zgcrt" event={"ID":"304b8888-e8d3-48c0-82f1-e655ad9edc79","Type":"ContainerDied","Data":"d526d6d95aeb98886aca2a95cf83b731410c8652885314337721634c4cf6ba94"} Feb 17 19:29:37 crc kubenswrapper[4892]: I0217 19:29:37.217270 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zgcrt" event={"ID":"304b8888-e8d3-48c0-82f1-e655ad9edc79","Type":"ContainerStarted","Data":"2d6f83ca625c0cac73aa3dc660c8819d0a5b4afe35fb11ed7ce58addaef6922f"} Feb 17 19:29:37 crc kubenswrapper[4892]: I0217 19:29:37.226280 4892 generic.go:334] "Generic (PLEG): container finished" podID="106208e4-2541-4e90-af0c-ca382c617cab" containerID="7916dcb4e7fef0f96a59fe7fd6988abc2ab60010c389af1751be47989a404e66" exitCode=0 Feb 17 19:29:37 crc kubenswrapper[4892]: I0217 19:29:37.226378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-79d7-account-create-update-t5x9n" event={"ID":"106208e4-2541-4e90-af0c-ca382c617cab","Type":"ContainerDied","Data":"7916dcb4e7fef0f96a59fe7fd6988abc2ab60010c389af1751be47989a404e66"} Feb 17 19:29:37 crc kubenswrapper[4892]: I0217 19:29:37.226417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-79d7-account-create-update-t5x9n" event={"ID":"106208e4-2541-4e90-af0c-ca382c617cab","Type":"ContainerStarted","Data":"b016d7748582743df62f6e8a2ecb78b90d5b748e14bdae17e79b89b63ea37e04"} Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.791125 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.801029 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.900051 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66h9k\" (UniqueName: \"kubernetes.io/projected/304b8888-e8d3-48c0-82f1-e655ad9edc79-kube-api-access-66h9k\") pod \"304b8888-e8d3-48c0-82f1-e655ad9edc79\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.900143 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b6sk\" (UniqueName: \"kubernetes.io/projected/106208e4-2541-4e90-af0c-ca382c617cab-kube-api-access-6b6sk\") pod \"106208e4-2541-4e90-af0c-ca382c617cab\" (UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.900200 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304b8888-e8d3-48c0-82f1-e655ad9edc79-operator-scripts\") pod \"304b8888-e8d3-48c0-82f1-e655ad9edc79\" (UID: \"304b8888-e8d3-48c0-82f1-e655ad9edc79\") " Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.900309 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106208e4-2541-4e90-af0c-ca382c617cab-operator-scripts\") pod \"106208e4-2541-4e90-af0c-ca382c617cab\" 
(UID: \"106208e4-2541-4e90-af0c-ca382c617cab\") " Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.901646 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106208e4-2541-4e90-af0c-ca382c617cab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "106208e4-2541-4e90-af0c-ca382c617cab" (UID: "106208e4-2541-4e90-af0c-ca382c617cab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.901765 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304b8888-e8d3-48c0-82f1-e655ad9edc79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "304b8888-e8d3-48c0-82f1-e655ad9edc79" (UID: "304b8888-e8d3-48c0-82f1-e655ad9edc79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.905559 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106208e4-2541-4e90-af0c-ca382c617cab-kube-api-access-6b6sk" (OuterVolumeSpecName: "kube-api-access-6b6sk") pod "106208e4-2541-4e90-af0c-ca382c617cab" (UID: "106208e4-2541-4e90-af0c-ca382c617cab"). InnerVolumeSpecName "kube-api-access-6b6sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:38 crc kubenswrapper[4892]: I0217 19:29:38.906025 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304b8888-e8d3-48c0-82f1-e655ad9edc79-kube-api-access-66h9k" (OuterVolumeSpecName: "kube-api-access-66h9k") pod "304b8888-e8d3-48c0-82f1-e655ad9edc79" (UID: "304b8888-e8d3-48c0-82f1-e655ad9edc79"). InnerVolumeSpecName "kube-api-access-66h9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.003905 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66h9k\" (UniqueName: \"kubernetes.io/projected/304b8888-e8d3-48c0-82f1-e655ad9edc79-kube-api-access-66h9k\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.003944 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b6sk\" (UniqueName: \"kubernetes.io/projected/106208e4-2541-4e90-af0c-ca382c617cab-kube-api-access-6b6sk\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.003958 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304b8888-e8d3-48c0-82f1-e655ad9edc79-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.003971 4892 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106208e4-2541-4e90-af0c-ca382c617cab-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.261076 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-79d7-account-create-update-t5x9n" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.261094 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-79d7-account-create-update-t5x9n" event={"ID":"106208e4-2541-4e90-af0c-ca382c617cab","Type":"ContainerDied","Data":"b016d7748582743df62f6e8a2ecb78b90d5b748e14bdae17e79b89b63ea37e04"} Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.261241 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b016d7748582743df62f6e8a2ecb78b90d5b748e14bdae17e79b89b63ea37e04" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.265094 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zgcrt" event={"ID":"304b8888-e8d3-48c0-82f1-e655ad9edc79","Type":"ContainerDied","Data":"2d6f83ca625c0cac73aa3dc660c8819d0a5b4afe35fb11ed7ce58addaef6922f"} Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.265155 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6f83ca625c0cac73aa3dc660c8819d0a5b4afe35fb11ed7ce58addaef6922f" Feb 17 19:29:39 crc kubenswrapper[4892]: I0217 19:29:39.265174 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-zgcrt" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.104273 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-lbj77"] Feb 17 19:29:41 crc kubenswrapper[4892]: E0217 19:29:41.105348 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304b8888-e8d3-48c0-82f1-e655ad9edc79" containerName="mariadb-database-create" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.105367 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="304b8888-e8d3-48c0-82f1-e655ad9edc79" containerName="mariadb-database-create" Feb 17 19:29:41 crc kubenswrapper[4892]: E0217 19:29:41.105433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106208e4-2541-4e90-af0c-ca382c617cab" containerName="mariadb-account-create-update" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.105442 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="106208e4-2541-4e90-af0c-ca382c617cab" containerName="mariadb-account-create-update" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.105749 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="106208e4-2541-4e90-af0c-ca382c617cab" containerName="mariadb-account-create-update" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.105778 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="304b8888-e8d3-48c0-82f1-e655ad9edc79" containerName="mariadb-database-create" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.107135 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.110457 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-vb5zt" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.111030 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.127639 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lbj77"] Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.269865 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4zt7\" (UniqueName: \"kubernetes.io/projected/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-kube-api-access-k4zt7\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.270374 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-config-data\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.270433 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-combined-ca-bundle\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.270583 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: 
\"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-job-config-data\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.373562 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-config-data\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.373641 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-combined-ca-bundle\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.373864 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-job-config-data\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.374023 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4zt7\" (UniqueName: \"kubernetes.io/projected/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-kube-api-access-k4zt7\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.383226 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-job-config-data\") pod \"manila-db-sync-lbj77\" (UID: 
\"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.394800 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-config-data\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.398008 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4zt7\" (UniqueName: \"kubernetes.io/projected/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-kube-api-access-k4zt7\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.400012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-combined-ca-bundle\") pod \"manila-db-sync-lbj77\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:41 crc kubenswrapper[4892]: I0217 19:29:41.429371 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:42 crc kubenswrapper[4892]: I0217 19:29:42.146547 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lbj77"] Feb 17 19:29:42 crc kubenswrapper[4892]: W0217 19:29:42.152236 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5bf75de_8d27_4ca0_be50_bad2df22a6ea.slice/crio-4e14a1f0d7e7c4d7bdde06db3befbaa07f6047d9336d3b62744bea2bbc1a3354 WatchSource:0}: Error finding container 4e14a1f0d7e7c4d7bdde06db3befbaa07f6047d9336d3b62744bea2bbc1a3354: Status 404 returned error can't find the container with id 4e14a1f0d7e7c4d7bdde06db3befbaa07f6047d9336d3b62744bea2bbc1a3354 Feb 17 19:29:42 crc kubenswrapper[4892]: I0217 19:29:42.306847 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lbj77" event={"ID":"e5bf75de-8d27-4ca0-be50-bad2df22a6ea","Type":"ContainerStarted","Data":"4e14a1f0d7e7c4d7bdde06db3befbaa07f6047d9336d3b62744bea2bbc1a3354"} Feb 17 19:29:43 crc kubenswrapper[4892]: I0217 19:29:43.048594 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xpqwv"] Feb 17 19:29:43 crc kubenswrapper[4892]: I0217 19:29:43.058528 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xpqwv"] Feb 17 19:29:43 crc kubenswrapper[4892]: I0217 19:29:43.372483 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b1b334-4337-48c7-88bb-259fc38f15e5" path="/var/lib/kubelet/pods/f9b1b334-4337-48c7-88bb-259fc38f15e5/volumes" Feb 17 19:29:47 crc kubenswrapper[4892]: I0217 19:29:47.379875 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lbj77" event={"ID":"e5bf75de-8d27-4ca0-be50-bad2df22a6ea","Type":"ContainerStarted","Data":"32ec4a4642066bf06fc91b65afd41a4d40cc80aed602b2487652f4d43ee45eee"} Feb 17 19:29:47 crc kubenswrapper[4892]: I0217 
19:29:47.413455 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-lbj77" podStartSLOduration=2.137775654 podStartE2EDuration="6.413430059s" podCreationTimestamp="2026-02-17 19:29:41 +0000 UTC" firstStartedPulling="2026-02-17 19:29:42.154321927 +0000 UTC m=+6353.529725192" lastFinishedPulling="2026-02-17 19:29:46.429976332 +0000 UTC m=+6357.805379597" observedRunningTime="2026-02-17 19:29:47.407760845 +0000 UTC m=+6358.783164120" watchObservedRunningTime="2026-02-17 19:29:47.413430059 +0000 UTC m=+6358.788833334" Feb 17 19:29:49 crc kubenswrapper[4892]: I0217 19:29:49.425884 4892 generic.go:334] "Generic (PLEG): container finished" podID="e5bf75de-8d27-4ca0-be50-bad2df22a6ea" containerID="32ec4a4642066bf06fc91b65afd41a4d40cc80aed602b2487652f4d43ee45eee" exitCode=0 Feb 17 19:29:49 crc kubenswrapper[4892]: I0217 19:29:49.426360 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lbj77" event={"ID":"e5bf75de-8d27-4ca0-be50-bad2df22a6ea","Type":"ContainerDied","Data":"32ec4a4642066bf06fc91b65afd41a4d40cc80aed602b2487652f4d43ee45eee"} Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.134759 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.243528 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-combined-ca-bundle\") pod \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.243576 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4zt7\" (UniqueName: \"kubernetes.io/projected/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-kube-api-access-k4zt7\") pod \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.243617 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-config-data\") pod \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.243697 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-job-config-data\") pod \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\" (UID: \"e5bf75de-8d27-4ca0-be50-bad2df22a6ea\") " Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.249079 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-kube-api-access-k4zt7" (OuterVolumeSpecName: "kube-api-access-k4zt7") pod "e5bf75de-8d27-4ca0-be50-bad2df22a6ea" (UID: "e5bf75de-8d27-4ca0-be50-bad2df22a6ea"). InnerVolumeSpecName "kube-api-access-k4zt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.249965 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "e5bf75de-8d27-4ca0-be50-bad2df22a6ea" (UID: "e5bf75de-8d27-4ca0-be50-bad2df22a6ea"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.254966 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-config-data" (OuterVolumeSpecName: "config-data") pod "e5bf75de-8d27-4ca0-be50-bad2df22a6ea" (UID: "e5bf75de-8d27-4ca0-be50-bad2df22a6ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.296911 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5bf75de-8d27-4ca0-be50-bad2df22a6ea" (UID: "e5bf75de-8d27-4ca0-be50-bad2df22a6ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.346153 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.346183 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4zt7\" (UniqueName: \"kubernetes.io/projected/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-kube-api-access-k4zt7\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.346194 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.346203 4892 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e5bf75de-8d27-4ca0-be50-bad2df22a6ea-job-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.460488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lbj77" event={"ID":"e5bf75de-8d27-4ca0-be50-bad2df22a6ea","Type":"ContainerDied","Data":"4e14a1f0d7e7c4d7bdde06db3befbaa07f6047d9336d3b62744bea2bbc1a3354"} Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.460523 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e14a1f0d7e7c4d7bdde06db3befbaa07f6047d9336d3b62744bea2bbc1a3354" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.460547 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lbj77" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.834952 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 17 19:29:51 crc kubenswrapper[4892]: E0217 19:29:51.835748 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bf75de-8d27-4ca0-be50-bad2df22a6ea" containerName="manila-db-sync" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.835773 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bf75de-8d27-4ca0-be50-bad2df22a6ea" containerName="manila-db-sync" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.836203 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bf75de-8d27-4ca0-be50-bad2df22a6ea" containerName="manila-db-sync" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.838426 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.841675 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.841963 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.842689 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.843008 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-vb5zt" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.855973 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.953125 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Feb 17 19:29:51 crc 
kubenswrapper[4892]: I0217 19:29:51.955314 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.957732 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.958735 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.958797 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.958832 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-config-data\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.958870 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c41fce-5694-4cf4-af3d-befaa59b6459-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.958892 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt5k2\" (UniqueName: \"kubernetes.io/projected/13c41fce-5694-4cf4-af3d-befaa59b6459-kube-api-access-qt5k2\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.959002 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-scripts\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:51 crc kubenswrapper[4892]: I0217 19:29:51.964515 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c41fce-5694-4cf4-af3d-befaa59b6459-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062393 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/806e5569-2407-4607-8331-cc09f54e37a6-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062419 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt5k2\" (UniqueName: \"kubernetes.io/projected/13c41fce-5694-4cf4-af3d-befaa59b6459-kube-api-access-qt5k2\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 
crc kubenswrapper[4892]: I0217 19:29:52.062439 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/806e5569-2407-4607-8331-cc09f54e37a6-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062520 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062554 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7hsk\" (UniqueName: \"kubernetes.io/projected/806e5569-2407-4607-8331-cc09f54e37a6-kube-api-access-k7hsk\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062580 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-config-data\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062617 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-scripts\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062638 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062674 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/806e5569-2407-4607-8331-cc09f54e37a6-ceph\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062718 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-scripts\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062756 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.062772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-config-data\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.063399 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c41fce-5694-4cf4-af3d-befaa59b6459-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.065745 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.068485 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.078634 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.081169 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-scripts\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.082650 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.095864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.096260 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.098589 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c41fce-5694-4cf4-af3d-befaa59b6459-config-data\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.106600 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt5k2\" (UniqueName: \"kubernetes.io/projected/13c41fce-5694-4cf4-af3d-befaa59b6459-kube-api-access-qt5k2\") pod \"manila-scheduler-0\" (UID: \"13c41fce-5694-4cf4-af3d-befaa59b6459\") " pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.107977 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d56cc9f7-v7w88"] Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.110074 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.141956 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d56cc9f7-v7w88"] Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-logs\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168146 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-scripts\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168184 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/806e5569-2407-4607-8331-cc09f54e37a6-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168210 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/806e5569-2407-4607-8331-cc09f54e37a6-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168586 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-config-data-custom\") pod \"manila-share-share1-0\" (UID: 
\"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168621 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168644 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-config-data\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168672 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7hsk\" (UniqueName: \"kubernetes.io/projected/806e5569-2407-4607-8331-cc09f54e37a6-kube-api-access-k7hsk\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168705 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-config-data-custom\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168723 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-config-data\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc 
kubenswrapper[4892]: I0217 19:29:52.168752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-etc-machine-id\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6wlb\" (UniqueName: \"kubernetes.io/projected/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-kube-api-access-c6wlb\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168823 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168870 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/806e5569-2407-4607-8331-cc09f54e37a6-ceph\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.168986 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/806e5569-2407-4607-8331-cc09f54e37a6-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.170036 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/806e5569-2407-4607-8331-cc09f54e37a6-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.170160 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-scripts\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.190154 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7hsk\" (UniqueName: \"kubernetes.io/projected/806e5569-2407-4607-8331-cc09f54e37a6-kube-api-access-k7hsk\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.191779 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.193491 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.197932 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.203864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/806e5569-2407-4607-8331-cc09f54e37a6-ceph\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.204972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-scripts\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.205639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806e5569-2407-4607-8331-cc09f54e37a6-config-data\") pod \"manila-share-share1-0\" (UID: \"806e5569-2407-4607-8331-cc09f54e37a6\") " pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275149 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-logs\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275204 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p559m\" (UniqueName: \"kubernetes.io/projected/81c2639e-73db-4dd2-abf6-01d6ddd092a2-kube-api-access-p559m\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275242 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-scripts\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275284 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-sb\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275371 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-config-data\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275395 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-dns-svc\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275416 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-config-data-custom\") pod 
\"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275437 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-etc-machine-id\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275451 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-config\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275476 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6wlb\" (UniqueName: \"kubernetes.io/projected/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-kube-api-access-c6wlb\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.275523 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-nb\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.276004 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-logs\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 
19:29:52.283854 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-etc-machine-id\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.284698 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.289739 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-scripts\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.293715 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.305791 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6wlb\" (UniqueName: \"kubernetes.io/projected/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-kube-api-access-c6wlb\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.315395 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-config-data\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.336600 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a2faa2-4ddf-4fd0-b254-1e5b139e29bb-config-data-custom\") pod \"manila-api-0\" (UID: \"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb\") " pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.386407 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-sb\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.387417 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-dns-svc\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.387488 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-config\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.398129 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-nb\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.398282 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p559m\" (UniqueName: 
\"kubernetes.io/projected/81c2639e-73db-4dd2-abf6-01d6ddd092a2-kube-api-access-p559m\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.398769 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-sb\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.398764 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-config\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.398961 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-dns-svc\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.400523 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-nb\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.403547 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.416992 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p559m\" (UniqueName: \"kubernetes.io/projected/81c2639e-73db-4dd2-abf6-01d6ddd092a2-kube-api-access-p559m\") pod \"dnsmasq-dns-55d56cc9f7-v7w88\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.713087 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:52 crc kubenswrapper[4892]: I0217 19:29:52.768009 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.018418 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.206143 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.329861 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d56cc9f7-v7w88"] Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.498692 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"13c41fce-5694-4cf4-af3d-befaa59b6459","Type":"ContainerStarted","Data":"9670d19dfd24b7669cf6ccdc95874066a8d8bbe57a9d2716116fc34dadef591f"} Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.507116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb","Type":"ContainerStarted","Data":"7ac02b40eb5daca05445fd39f35d757be8de86a1bf533e6c2211a790b14b5e1a"} Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.508391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" event={"ID":"81c2639e-73db-4dd2-abf6-01d6ddd092a2","Type":"ContainerStarted","Data":"6e99fa72ec340a0a3c2adcdad13d7e55d86c45ec2972730d585a8063d6ceec03"} Feb 17 19:29:53 crc kubenswrapper[4892]: I0217 19:29:53.509409 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"806e5569-2407-4607-8331-cc09f54e37a6","Type":"ContainerStarted","Data":"a2dccc36643e1cf9ee89c61ae8569457f7a5e45e069351f26485b07dcfa010c3"} Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.532942 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"13c41fce-5694-4cf4-af3d-befaa59b6459","Type":"ContainerStarted","Data":"662decd4403a8044671da067f35ea08602fd68c0b1d29f6bccd04d5974f569dd"} Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.535291 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb","Type":"ContainerStarted","Data":"accfc223db76940c970e8063d024c562d79689fe267046f890005833da368379"} Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.535334 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"37a2faa2-4ddf-4fd0-b254-1e5b139e29bb","Type":"ContainerStarted","Data":"cd2979d58d1e295662dc43d57e89d5f3371b7c469ff409dce43e9a86476c6cb2"} Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.535366 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.537275 4892 generic.go:334] "Generic (PLEG): container finished" podID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerID="079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572" exitCode=0 Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.537309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" 
event={"ID":"81c2639e-73db-4dd2-abf6-01d6ddd092a2","Type":"ContainerDied","Data":"079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572"} Feb 17 19:29:54 crc kubenswrapper[4892]: I0217 19:29:54.581909 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.581891143 podStartE2EDuration="2.581891143s" podCreationTimestamp="2026-02-17 19:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:29:54.551311435 +0000 UTC m=+6365.926714710" watchObservedRunningTime="2026-02-17 19:29:54.581891143 +0000 UTC m=+6365.957294408" Feb 17 19:29:55 crc kubenswrapper[4892]: I0217 19:29:55.552377 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"13c41fce-5694-4cf4-af3d-befaa59b6459","Type":"ContainerStarted","Data":"145322d87064e45d26cabbd5b11623f827b5504c4e9b9923ad1a2d1413319174"} Feb 17 19:29:55 crc kubenswrapper[4892]: I0217 19:29:55.557285 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" event={"ID":"81c2639e-73db-4dd2-abf6-01d6ddd092a2","Type":"ContainerStarted","Data":"ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49"} Feb 17 19:29:55 crc kubenswrapper[4892]: I0217 19:29:55.557356 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:29:55 crc kubenswrapper[4892]: I0217 19:29:55.581117 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.885523075 podStartE2EDuration="4.581098855s" podCreationTimestamp="2026-02-17 19:29:51 +0000 UTC" firstStartedPulling="2026-02-17 19:29:52.788441918 +0000 UTC m=+6364.163845183" lastFinishedPulling="2026-02-17 19:29:53.484017698 +0000 UTC m=+6364.859420963" observedRunningTime="2026-02-17 
19:29:55.573290663 +0000 UTC m=+6366.948693938" watchObservedRunningTime="2026-02-17 19:29:55.581098855 +0000 UTC m=+6366.956502120" Feb 17 19:29:55 crc kubenswrapper[4892]: I0217 19:29:55.606476 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" podStartSLOduration=3.606457262 podStartE2EDuration="3.606457262s" podCreationTimestamp="2026-02-17 19:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:29:55.596279477 +0000 UTC m=+6366.971682742" watchObservedRunningTime="2026-02-17 19:29:55.606457262 +0000 UTC m=+6366.981860517" Feb 17 19:29:56 crc kubenswrapper[4892]: I0217 19:29:56.883301 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:29:56 crc kubenswrapper[4892]: I0217 19:29:56.883922 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-central-agent" containerID="cri-o://e4e7f3de758dcffad3c1b7afb225ab969ab3302b0b7b5d525f37ffca623a0c9d" gracePeriod=30 Feb 17 19:29:56 crc kubenswrapper[4892]: I0217 19:29:56.884026 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="proxy-httpd" containerID="cri-o://be644b50ba916308679414702f03d47e40b84c47c7741b31501290e4bed7ba4c" gracePeriod=30 Feb 17 19:29:56 crc kubenswrapper[4892]: I0217 19:29:56.884049 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="sg-core" containerID="cri-o://ca1759fe3213ccf74056245143820c16aa75c8882f71fb4c2b5367c1df147716" gracePeriod=30 Feb 17 19:29:56 crc kubenswrapper[4892]: I0217 19:29:56.884062 4892 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-notification-agent" containerID="cri-o://2fd09ccb026953c0d4fdbf7c124f47e8c78a8a4ca9b8831e521388b680f71874" gracePeriod=30 Feb 17 19:29:56 crc kubenswrapper[4892]: I0217 19:29:56.896166 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.169:3000/\": EOF" Feb 17 19:29:57 crc kubenswrapper[4892]: I0217 19:29:57.588627 4892 generic.go:334] "Generic (PLEG): container finished" podID="11d7e146-b34b-4407-ba25-c19028f250ac" containerID="be644b50ba916308679414702f03d47e40b84c47c7741b31501290e4bed7ba4c" exitCode=0 Feb 17 19:29:57 crc kubenswrapper[4892]: I0217 19:29:57.588668 4892 generic.go:334] "Generic (PLEG): container finished" podID="11d7e146-b34b-4407-ba25-c19028f250ac" containerID="ca1759fe3213ccf74056245143820c16aa75c8882f71fb4c2b5367c1df147716" exitCode=2 Feb 17 19:29:57 crc kubenswrapper[4892]: I0217 19:29:57.588679 4892 generic.go:334] "Generic (PLEG): container finished" podID="11d7e146-b34b-4407-ba25-c19028f250ac" containerID="e4e7f3de758dcffad3c1b7afb225ab969ab3302b0b7b5d525f37ffca623a0c9d" exitCode=0 Feb 17 19:29:57 crc kubenswrapper[4892]: I0217 19:29:57.588702 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerDied","Data":"be644b50ba916308679414702f03d47e40b84c47c7741b31501290e4bed7ba4c"} Feb 17 19:29:57 crc kubenswrapper[4892]: I0217 19:29:57.588747 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerDied","Data":"ca1759fe3213ccf74056245143820c16aa75c8882f71fb4c2b5367c1df147716"} Feb 17 19:29:57 crc kubenswrapper[4892]: I0217 19:29:57.588757 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerDied","Data":"e4e7f3de758dcffad3c1b7afb225ab969ab3302b0b7b5d525f37ffca623a0c9d"} Feb 17 19:29:59 crc kubenswrapper[4892]: I0217 19:29:59.619377 4892 generic.go:334] "Generic (PLEG): container finished" podID="11d7e146-b34b-4407-ba25-c19028f250ac" containerID="2fd09ccb026953c0d4fdbf7c124f47e8c78a8a4ca9b8831e521388b680f71874" exitCode=0 Feb 17 19:29:59 crc kubenswrapper[4892]: I0217 19:29:59.619463 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerDied","Data":"2fd09ccb026953c0d4fdbf7c124f47e8c78a8a4ca9b8831e521388b680f71874"} Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.154382 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"] Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.156475 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.159318 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.161002 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.164900 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"] Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.268358 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmccx\" (UniqueName: \"kubernetes.io/projected/ddcc45a7-ec52-48c2-abf1-ba45be48d183-kube-api-access-lmccx\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.268548 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddcc45a7-ec52-48c2-abf1-ba45be48d183-config-volume\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.268593 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddcc45a7-ec52-48c2-abf1-ba45be48d183-secret-volume\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.370517 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmccx\" (UniqueName: \"kubernetes.io/projected/ddcc45a7-ec52-48c2-abf1-ba45be48d183-kube-api-access-lmccx\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.370893 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddcc45a7-ec52-48c2-abf1-ba45be48d183-config-volume\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.370926 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddcc45a7-ec52-48c2-abf1-ba45be48d183-secret-volume\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.372697 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddcc45a7-ec52-48c2-abf1-ba45be48d183-config-volume\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.380684 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ddcc45a7-ec52-48c2-abf1-ba45be48d183-secret-volume\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.389466 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmccx\" (UniqueName: \"kubernetes.io/projected/ddcc45a7-ec52-48c2-abf1-ba45be48d183-kube-api-access-lmccx\") pod \"collect-profiles-29522610-lwvp7\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.484046 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" Feb 17 19:30:00 crc kubenswrapper[4892]: I0217 19:30:00.509846 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.169:3000/\": dial tcp 10.217.1.169:3000: connect: connection refused" Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.656572 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11d7e146-b34b-4407-ba25-c19028f250ac","Type":"ContainerDied","Data":"4195f1d8ea7cc1b7c9c9253383733b73dafebc76eca59368d6651bb20a428387"} Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.657236 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4195f1d8ea7cc1b7c9c9253383733b73dafebc76eca59368d6651bb20a428387" Feb 17 19:30:01 crc kubenswrapper[4892]: W0217 19:30:01.720131 4892 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddcc45a7_ec52_48c2_abf1_ba45be48d183.slice/crio-b26c73c0c878059a040f77300aa87e17968d658af0e0e4d0fb40bc6caccff0ea WatchSource:0}: Error finding container b26c73c0c878059a040f77300aa87e17968d658af0e0e4d0fb40bc6caccff0ea: Status 404 returned error can't find the container with id b26c73c0c878059a040f77300aa87e17968d658af0e0e4d0fb40bc6caccff0ea Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.723456 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"] Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.831974 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909045 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-sg-core-conf-yaml\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909084 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-combined-ca-bundle\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909106 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-config-data\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909228 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-scripts\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909300 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2dld\" (UniqueName: \"kubernetes.io/projected/11d7e146-b34b-4407-ba25-c19028f250ac-kube-api-access-n2dld\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909339 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-log-httpd\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.909400 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-run-httpd\") pod \"11d7e146-b34b-4407-ba25-c19028f250ac\" (UID: \"11d7e146-b34b-4407-ba25-c19028f250ac\") " Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.910960 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.911247 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.930058 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-scripts" (OuterVolumeSpecName: "scripts") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.937241 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d7e146-b34b-4407-ba25-c19028f250ac-kube-api-access-n2dld" (OuterVolumeSpecName: "kube-api-access-n2dld") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). InnerVolumeSpecName "kube-api-access-n2dld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:30:01 crc kubenswrapper[4892]: I0217 19:30:01.992386 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.016062 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.016090 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.016099 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2dld\" (UniqueName: \"kubernetes.io/projected/11d7e146-b34b-4407-ba25-c19028f250ac-kube-api-access-n2dld\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.016109 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.016118 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11d7e146-b34b-4407-ba25-c19028f250ac-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.057201 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.074359 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-config-data" (OuterVolumeSpecName: "config-data") pod "11d7e146-b34b-4407-ba25-c19028f250ac" (UID: "11d7e146-b34b-4407-ba25-c19028f250ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.118860 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.118900 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d7e146-b34b-4407-ba25-c19028f250ac-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.199079 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.668636 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"806e5569-2407-4607-8331-cc09f54e37a6","Type":"ContainerStarted","Data":"a3eefae18af32b7aa916bfd0741e8b30a60863ee09c917655bf92bc58de83e40"} Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.669031 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"806e5569-2407-4607-8331-cc09f54e37a6","Type":"ContainerStarted","Data":"85dab55bfc18c364a43d004a5dddbb02fed2e1a8e44e84e8184d024e602b6c26"} Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.672319 4892 generic.go:334] "Generic (PLEG): container finished" podID="ddcc45a7-ec52-48c2-abf1-ba45be48d183" 
containerID="443ca8a21c8d7b6145693c4135642c571f32ded888065039e4d7a019823a44f7" exitCode=0 Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.672475 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.672582 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" event={"ID":"ddcc45a7-ec52-48c2-abf1-ba45be48d183","Type":"ContainerDied","Data":"443ca8a21c8d7b6145693c4135642c571f32ded888065039e4d7a019823a44f7"} Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.672688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" event={"ID":"ddcc45a7-ec52-48c2-abf1-ba45be48d183","Type":"ContainerStarted","Data":"b26c73c0c878059a040f77300aa87e17968d658af0e0e4d0fb40bc6caccff0ea"} Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.705389 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.520707835 podStartE2EDuration="11.705370843s" podCreationTimestamp="2026-02-17 19:29:51 +0000 UTC" firstStartedPulling="2026-02-17 19:29:53.037438692 +0000 UTC m=+6364.412841957" lastFinishedPulling="2026-02-17 19:30:01.22210168 +0000 UTC m=+6372.597504965" observedRunningTime="2026-02-17 19:30:02.701291403 +0000 UTC m=+6374.076694698" watchObservedRunningTime="2026-02-17 19:30:02.705370843 +0000 UTC m=+6374.080774118" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.715044 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.734188 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.750906 4892 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.779864 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:30:02 crc kubenswrapper[4892]: E0217 19:30:02.780442 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="proxy-httpd" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780458 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="proxy-httpd" Feb 17 19:30:02 crc kubenswrapper[4892]: E0217 19:30:02.780468 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-central-agent" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780476 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-central-agent" Feb 17 19:30:02 crc kubenswrapper[4892]: E0217 19:30:02.780491 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="sg-core" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780499 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="sg-core" Feb 17 19:30:02 crc kubenswrapper[4892]: E0217 19:30:02.780523 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-notification-agent" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780529 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-notification-agent" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780755 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" 
containerName="proxy-httpd" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780771 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-notification-agent" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780782 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="sg-core" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.780792 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" containerName="ceilometer-central-agent" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.782950 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.793107 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.793630 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.793979 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836202 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836274 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjnc6\" (UniqueName: \"kubernetes.io/projected/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-kube-api-access-bjnc6\") pod \"ceilometer-0\" (UID: 
\"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836314 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-scripts\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836334 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-config-data\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836401 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-run-httpd\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836427 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.836445 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-log-httpd\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.856599 4892 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-747474b8df-c8tcp"] Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.859229 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerName="dnsmasq-dns" containerID="cri-o://8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9" gracePeriod=10 Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940452 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940740 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjnc6\" (UniqueName: \"kubernetes.io/projected/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-kube-api-access-bjnc6\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940771 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-scripts\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940785 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-config-data\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940863 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-run-httpd\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940893 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.940908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-log-httpd\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.942110 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-log-httpd\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.942928 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-run-httpd\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.955125 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0" Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 
19:30:02.956635 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-scripts\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0"
Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.964282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-config-data\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0"
Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.967811 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0"
Feb 17 19:30:02 crc kubenswrapper[4892]: I0217 19:30:02.970585 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjnc6\" (UniqueName: \"kubernetes.io/projected/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-kube-api-access-bjnc6\") pod \"ceilometer-0\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") " pod="openstack/ceilometer-0"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.110851 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.389842 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d7e146-b34b-4407-ba25-c19028f250ac" path="/var/lib/kubelet/pods/11d7e146-b34b-4407-ba25-c19028f250ac/volumes"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.689713 4892 generic.go:334] "Generic (PLEG): container finished" podID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerID="8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9" exitCode=0
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.689753 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-747474b8df-c8tcp"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.690017 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" event={"ID":"bb938d82-faba-45d3-8829-2aeb76c0e18c","Type":"ContainerDied","Data":"8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9"}
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.690063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747474b8df-c8tcp" event={"ID":"bb938d82-faba-45d3-8829-2aeb76c0e18c","Type":"ContainerDied","Data":"56d64137ca22148bf7b17ecbf0887980901fd02d7117d2310abfabe69f91a72a"}
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.690082 4892 scope.go:117] "RemoveContainer" containerID="8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.736842 4892 scope.go:117] "RemoveContainer" containerID="81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.773855 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-dns-svc\") pod \"bb938d82-faba-45d3-8829-2aeb76c0e18c\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") "
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.774152 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-nb\") pod \"bb938d82-faba-45d3-8829-2aeb76c0e18c\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") "
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.774186 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-sb\") pod \"bb938d82-faba-45d3-8829-2aeb76c0e18c\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") "
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.774212 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-config\") pod \"bb938d82-faba-45d3-8829-2aeb76c0e18c\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") "
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.774279 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl8p7\" (UniqueName: \"kubernetes.io/projected/bb938d82-faba-45d3-8829-2aeb76c0e18c-kube-api-access-rl8p7\") pod \"bb938d82-faba-45d3-8829-2aeb76c0e18c\" (UID: \"bb938d82-faba-45d3-8829-2aeb76c0e18c\") "
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.776895 4892 scope.go:117] "RemoveContainer" containerID="8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9"
Feb 17 19:30:03 crc kubenswrapper[4892]: E0217 19:30:03.779862 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9\": container with ID starting with 8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9 not found: ID does not exist" containerID="8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.779910 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9"} err="failed to get container status \"8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9\": rpc error: code = NotFound desc = could not find container \"8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9\": container with ID starting with 8d8a76213b1f6f3f3eeaab602e47cb61dd5453b368d09441b4e02dc4d7b79ae9 not found: ID does not exist"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.779937 4892 scope.go:117] "RemoveContainer" containerID="81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05"
Feb 17 19:30:03 crc kubenswrapper[4892]: E0217 19:30:03.784355 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05\": container with ID starting with 81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05 not found: ID does not exist" containerID="81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.784386 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05"} err="failed to get container status \"81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05\": rpc error: code = NotFound desc = could not find container \"81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05\": container with ID starting with 81a68a321592b06058d6286cdb835cef2b8afc93edcb4e1c6cc273ae77c40f05 not found: ID does not exist"
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.793718 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb938d82-faba-45d3-8829-2aeb76c0e18c-kube-api-access-rl8p7" (OuterVolumeSpecName: "kube-api-access-rl8p7") pod "bb938d82-faba-45d3-8829-2aeb76c0e18c" (UID: "bb938d82-faba-45d3-8829-2aeb76c0e18c"). InnerVolumeSpecName "kube-api-access-rl8p7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.830516 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.865191 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb938d82-faba-45d3-8829-2aeb76c0e18c" (UID: "bb938d82-faba-45d3-8829-2aeb76c0e18c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.877865 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.877899 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl8p7\" (UniqueName: \"kubernetes.io/projected/bb938d82-faba-45d3-8829-2aeb76c0e18c-kube-api-access-rl8p7\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.903181 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb938d82-faba-45d3-8829-2aeb76c0e18c" (UID: "bb938d82-faba-45d3-8829-2aeb76c0e18c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.926630 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-config" (OuterVolumeSpecName: "config") pod "bb938d82-faba-45d3-8829-2aeb76c0e18c" (UID: "bb938d82-faba-45d3-8829-2aeb76c0e18c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.932681 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb938d82-faba-45d3-8829-2aeb76c0e18c" (UID: "bb938d82-faba-45d3-8829-2aeb76c0e18c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.979752 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.979776 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:03 crc kubenswrapper[4892]: I0217 19:30:03.979785 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb938d82-faba-45d3-8829-2aeb76c0e18c-config\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.005141 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.083888 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddcc45a7-ec52-48c2-abf1-ba45be48d183-config-volume\") pod \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") "
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.084170 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddcc45a7-ec52-48c2-abf1-ba45be48d183-secret-volume\") pod \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") "
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.084366 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmccx\" (UniqueName: \"kubernetes.io/projected/ddcc45a7-ec52-48c2-abf1-ba45be48d183-kube-api-access-lmccx\") pod \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\" (UID: \"ddcc45a7-ec52-48c2-abf1-ba45be48d183\") "
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.084456 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcc45a7-ec52-48c2-abf1-ba45be48d183-config-volume" (OuterVolumeSpecName: "config-volume") pod "ddcc45a7-ec52-48c2-abf1-ba45be48d183" (UID: "ddcc45a7-ec52-48c2-abf1-ba45be48d183"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.085724 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddcc45a7-ec52-48c2-abf1-ba45be48d183-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.088650 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc45a7-ec52-48c2-abf1-ba45be48d183-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ddcc45a7-ec52-48c2-abf1-ba45be48d183" (UID: "ddcc45a7-ec52-48c2-abf1-ba45be48d183"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.091361 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcc45a7-ec52-48c2-abf1-ba45be48d183-kube-api-access-lmccx" (OuterVolumeSpecName: "kube-api-access-lmccx") pod "ddcc45a7-ec52-48c2-abf1-ba45be48d183" (UID: "ddcc45a7-ec52-48c2-abf1-ba45be48d183"). InnerVolumeSpecName "kube-api-access-lmccx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.188101 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddcc45a7-ec52-48c2-abf1-ba45be48d183-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.188148 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmccx\" (UniqueName: \"kubernetes.io/projected/ddcc45a7-ec52-48c2-abf1-ba45be48d183-kube-api-access-lmccx\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.705882 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.706687 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7" event={"ID":"ddcc45a7-ec52-48c2-abf1-ba45be48d183","Type":"ContainerDied","Data":"b26c73c0c878059a040f77300aa87e17968d658af0e0e4d0fb40bc6caccff0ea"}
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.706955 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b26c73c0c878059a040f77300aa87e17968d658af0e0e4d0fb40bc6caccff0ea"
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.713077 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerStarted","Data":"3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea"}
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.713110 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerStarted","Data":"ece4837681bc971d3d44d17956b090d21ab049d4afd5455b1d4653afdd8e53ca"}
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.715693 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-747474b8df-c8tcp"
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.763804 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-747474b8df-c8tcp"]
Feb 17 19:30:04 crc kubenswrapper[4892]: I0217 19:30:04.776478 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-747474b8df-c8tcp"]
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.016718 4892 scope.go:117] "RemoveContainer" containerID="a32c3eff22184a4eeb92c6b68faad9b1c77b11a41a4ab582aa7da95cd90f3351"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.082076 4892 scope.go:117] "RemoveContainer" containerID="1c3ae0c61f075ab319a23e6c94701988aa7d870316d86e0d474f3faa28ccaaf9"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.089591 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f"]
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.099195 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522565-llf4f"]
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.197670 4892 scope.go:117] "RemoveContainer" containerID="6b16a0927f66df6489fd569847254692c7552c057528d5e7c548c2cf5eb5f9cc"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.255257 4892 scope.go:117] "RemoveContainer" containerID="7ca3c76c178458c0147c5c14f7e6e40ce6c73b8caf5bf97e097aff16c0aafe56"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.271271 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.318662 4892 scope.go:117] "RemoveContainer" containerID="c82f041831853527efa635daae8f08db95e3a2f4911b9221c56338da464e66dd"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.362347 4892 scope.go:117] "RemoveContainer" containerID="9209cffe36c192d2ded24a8b285eaaf05912ad95d3061f67a9efab49d8beb351"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.378540 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a216cfbe-7448-4d20-a9df-bf50992681b9" path="/var/lib/kubelet/pods/a216cfbe-7448-4d20-a9df-bf50992681b9/volumes"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.380746 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" path="/var/lib/kubelet/pods/bb938d82-faba-45d3-8829-2aeb76c0e18c/volumes"
Feb 17 19:30:05 crc kubenswrapper[4892]: I0217 19:30:05.729805 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerStarted","Data":"0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157"}
Feb 17 19:30:06 crc kubenswrapper[4892]: I0217 19:30:06.741697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerStarted","Data":"4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73"}
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.756994 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerStarted","Data":"47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc"}
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.758467 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.757558 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="proxy-httpd" containerID="cri-o://47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc" gracePeriod=30
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.757612 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="sg-core" containerID="cri-o://4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73" gracePeriod=30
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.757611 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-notification-agent" containerID="cri-o://0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157" gracePeriod=30
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.757430 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-central-agent" containerID="cri-o://3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea" gracePeriod=30
Feb 17 19:30:07 crc kubenswrapper[4892]: I0217 19:30:07.802944 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.452433132 podStartE2EDuration="5.802905888s" podCreationTimestamp="2026-02-17 19:30:02 +0000 UTC" firstStartedPulling="2026-02-17 19:30:03.91068818 +0000 UTC m=+6375.286091445" lastFinishedPulling="2026-02-17 19:30:07.261160906 +0000 UTC m=+6378.636564201" observedRunningTime="2026-02-17 19:30:07.778524598 +0000 UTC m=+6379.153927903" watchObservedRunningTime="2026-02-17 19:30:07.802905888 +0000 UTC m=+6379.178309163"
Feb 17 19:30:08 crc kubenswrapper[4892]: I0217 19:30:08.774766 4892 generic.go:334] "Generic (PLEG): container finished" podID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerID="47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc" exitCode=0
Feb 17 19:30:08 crc kubenswrapper[4892]: I0217 19:30:08.775092 4892 generic.go:334] "Generic (PLEG): container finished" podID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerID="4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73" exitCode=2
Feb 17 19:30:08 crc kubenswrapper[4892]: I0217 19:30:08.775106 4892 generic.go:334] "Generic (PLEG): container finished" podID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerID="0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157" exitCode=0
Feb 17 19:30:08 crc kubenswrapper[4892]: I0217 19:30:08.774865 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerDied","Data":"47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc"}
Feb 17 19:30:08 crc kubenswrapper[4892]: I0217 19:30:08.775152 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerDied","Data":"4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73"}
Feb 17 19:30:08 crc kubenswrapper[4892]: I0217 19:30:08.775171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerDied","Data":"0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157"}
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.725014 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.802245 4892 generic.go:334] "Generic (PLEG): container finished" podID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerID="3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea" exitCode=0
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.802291 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerDied","Data":"3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea"}
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.802335 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.802369 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2","Type":"ContainerDied","Data":"ece4837681bc971d3d44d17956b090d21ab049d4afd5455b1d4653afdd8e53ca"}
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.802406 4892 scope.go:117] "RemoveContainer" containerID="47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc"
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.827140 4892 scope.go:117] "RemoveContainer" containerID="4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73"
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.855486 4892 scope.go:117] "RemoveContainer" containerID="0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157"
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.876874 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-run-httpd\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877001 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-combined-ca-bundle\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877045 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-sg-core-conf-yaml\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877122 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-config-data\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjnc6\" (UniqueName: \"kubernetes.io/projected/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-kube-api-access-bjnc6\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-log-httpd\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877340 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-scripts\") pod \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\" (UID: \"48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2\") "
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.877461 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.878040 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.878100 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.882213 4892 scope.go:117] "RemoveContainer" containerID="3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea"
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.883579 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-scripts" (OuterVolumeSpecName: "scripts") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.883663 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-kube-api-access-bjnc6" (OuterVolumeSpecName: "kube-api-access-bjnc6") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "kube-api-access-bjnc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.918400 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.980467 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjnc6\" (UniqueName: \"kubernetes.io/projected/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-kube-api-access-bjnc6\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.980504 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.980514 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:10 crc kubenswrapper[4892]: I0217 19:30:10.980523 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.011044 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.030274 4892 scope.go:117] "RemoveContainer" containerID="47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.032233 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc\": container with ID starting with 47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc not found: ID does not exist" containerID="47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.032279 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc"} err="failed to get container status \"47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc\": rpc error: code = NotFound desc = could not find container \"47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc\": container with ID starting with 47639ab84dd7a512fb2ce7e9599507ab40a857aa1095323066f3afc8e72492dc not found: ID does not exist"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.032306 4892 scope.go:117] "RemoveContainer" containerID="4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.033546 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-config-data" (OuterVolumeSpecName: "config-data") pod "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" (UID: "48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.034005 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73\": container with ID starting with 4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73 not found: ID does not exist" containerID="4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.034132 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73"} err="failed to get container status \"4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73\": rpc error: code = NotFound desc = could not find container \"4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73\": container with ID starting with 4a51fd82343a1ca5f40b80587f36d9d7d91b50c7b2b2b7c28040cd02e957cd73 not found: ID does not exist"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.034206 4892 scope.go:117] "RemoveContainer" containerID="0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.034625 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157\": container with ID starting with 0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157 not found: ID does not exist" containerID="0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.034707 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157"} err="failed to get container status \"0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157\": rpc error: code = NotFound desc = could not find container \"0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157\": container with ID starting with 0d5279381caa246d600800f7c25a2e6db7ad93e1c5016c6f999663faacb46157 not found: ID does not exist"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.034777 4892 scope.go:117] "RemoveContainer" containerID="3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.037075 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea\": container with ID starting with 3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea not found: ID does not exist" containerID="3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.037164 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea"} err="failed to get container status \"3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea\": rpc error: code = NotFound desc = could not find container \"3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea\": container with ID starting with 3b2f744588ebcc22ab7591b8def16d10f15eceeaf984493cf59be4633e897cea not found: ID does not exist"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.082647 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.083142 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.138492 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.156592 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.172445 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173006 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="proxy-httpd"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173019 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="proxy-httpd"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173040 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc45a7-ec52-48c2-abf1-ba45be48d183" containerName="collect-profiles"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173047 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc45a7-ec52-48c2-abf1-ba45be48d183" containerName="collect-profiles"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173057 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerName="dnsmasq-dns"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173063 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerName="dnsmasq-dns"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173088 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="sg-core"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173094 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="sg-core"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173113 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-notification-agent"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173118 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-notification-agent"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173134 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerName="init"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173140 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerName="init"
Feb 17 19:30:11 crc kubenswrapper[4892]: E0217 19:30:11.173150 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-central-agent"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173155 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-central-agent"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173374 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="proxy-httpd"
Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173386 4892 memory_manager.go:354] "RemoveStaleState removing
state" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-notification-agent" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173407 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb938d82-faba-45d3-8829-2aeb76c0e18c" containerName="dnsmasq-dns" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173414 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="sg-core" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173427 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcc45a7-ec52-48c2-abf1-ba45be48d183" containerName="collect-profiles" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.173442 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" containerName="ceilometer-central-agent" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.179928 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.183110 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.185016 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.191417 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293115 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-log-httpd\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293180 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-scripts\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293235 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293290 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkkrp\" (UniqueName: \"kubernetes.io/projected/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-kube-api-access-bkkrp\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " 
pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-run-httpd\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293380 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.293409 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-config-data\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.376724 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2" path="/var/lib/kubelet/pods/48fb7ae8-4400-4cd3-bbc0-c6f145c4a6b2/volumes" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396426 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkkrp\" (UniqueName: \"kubernetes.io/projected/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-kube-api-access-bkkrp\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396534 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-run-httpd\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396600 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396652 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-config-data\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-log-httpd\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396761 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-scripts\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.396843 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.397871 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-run-httpd\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.398345 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-log-httpd\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.402508 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.403736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.409176 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-config-data\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.412983 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-scripts\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 
19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.414540 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkkrp\" (UniqueName: \"kubernetes.io/projected/ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f-kube-api-access-bkkrp\") pod \"ceilometer-0\" (UID: \"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f\") " pod="openstack/ceilometer-0" Feb 17 19:30:11 crc kubenswrapper[4892]: I0217 19:30:11.503039 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 19:30:12 crc kubenswrapper[4892]: I0217 19:30:12.061754 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 19:30:12 crc kubenswrapper[4892]: I0217 19:30:12.285785 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 17 19:30:12 crc kubenswrapper[4892]: I0217 19:30:12.834324 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerStarted","Data":"eb4484c662b22b2f889a6e41fda5cd12f48a6f2ba6f7509c806823f31c8e8aa1"} Feb 17 19:30:12 crc kubenswrapper[4892]: I0217 19:30:12.834579 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerStarted","Data":"948e054e3e1daba8744745280868e846c89a58eaf011baa446d020767cd48bee"} Feb 17 19:30:13 crc kubenswrapper[4892]: I0217 19:30:13.858914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerStarted","Data":"5e344c98f57abee2749028075420930eff2d7200885c10f515fb1a64bed0ed0c"} Feb 17 19:30:14 crc kubenswrapper[4892]: I0217 19:30:14.062220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Feb 17 19:30:14 crc kubenswrapper[4892]: I0217 19:30:14.316186 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 17 19:30:14 crc kubenswrapper[4892]: I0217 19:30:14.385438 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 17 19:30:14 crc kubenswrapper[4892]: I0217 19:30:14.872420 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerStarted","Data":"789c4ae6cb709082b885bb773aacd56ca09b18cdf946c452080b871167681ec8"} Feb 17 19:30:15 crc kubenswrapper[4892]: I0217 19:30:15.885776 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerStarted","Data":"049f69417a7b49db3cf3d897a123cc831d7f6afc859a3fe02cfd5734dfc46380"} Feb 17 19:30:15 crc kubenswrapper[4892]: I0217 19:30:15.886411 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 19:30:15 crc kubenswrapper[4892]: I0217 19:30:15.907346 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.554725819 podStartE2EDuration="4.907327953s" podCreationTimestamp="2026-02-17 19:30:11 +0000 UTC" firstStartedPulling="2026-02-17 19:30:12.065863849 +0000 UTC m=+6383.441267134" lastFinishedPulling="2026-02-17 19:30:15.418465993 +0000 UTC m=+6386.793869268" observedRunningTime="2026-02-17 19:30:15.905280718 +0000 UTC m=+6387.280683983" watchObservedRunningTime="2026-02-17 19:30:15.907327953 +0000 UTC m=+6387.282731218" Feb 17 19:30:26 crc kubenswrapper[4892]: I0217 19:30:26.030601 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8127-account-create-update-tkktr"] Feb 17 19:30:26 crc kubenswrapper[4892]: I0217 19:30:26.041191 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zn4fv"] Feb 17 19:30:26 crc 
kubenswrapper[4892]: I0217 19:30:26.052404 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zn4fv"] Feb 17 19:30:26 crc kubenswrapper[4892]: I0217 19:30:26.062563 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8127-account-create-update-tkktr"] Feb 17 19:30:27 crc kubenswrapper[4892]: I0217 19:30:27.375856 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5529f735-60b3-453d-8f60-88e712323868" path="/var/lib/kubelet/pods/5529f735-60b3-453d-8f60-88e712323868/volumes" Feb 17 19:30:27 crc kubenswrapper[4892]: I0217 19:30:27.380171 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f660f012-1387-4d60-a990-2dbb04f06f42" path="/var/lib/kubelet/pods/f660f012-1387-4d60-a990-2dbb04f06f42/volumes" Feb 17 19:30:34 crc kubenswrapper[4892]: I0217 19:30:34.067995 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-hqdbg"] Feb 17 19:30:34 crc kubenswrapper[4892]: I0217 19:30:34.086968 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-hqdbg"] Feb 17 19:30:35 crc kubenswrapper[4892]: I0217 19:30:35.376145 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1436563-c0ac-4fff-9f7d-84e644d9061a" path="/var/lib/kubelet/pods/b1436563-c0ac-4fff-9f7d-84e644d9061a/volumes" Feb 17 19:30:37 crc kubenswrapper[4892]: I0217 19:30:37.424535 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:30:37 crc kubenswrapper[4892]: I0217 19:30:37.425064 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:30:41 crc kubenswrapper[4892]: I0217 19:30:41.515201 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.172948 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54d469b495-gltpp"] Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.176444 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.185241 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.201686 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54d469b495-gltpp"] Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.304480 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-nb\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.304548 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-openstack-cell1\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.305019 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqjqw\" (UniqueName: 
\"kubernetes.io/projected/47439126-d0ea-45a1-ad04-74057fde2a2b-kube-api-access-fqjqw\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.305135 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-dns-svc\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.305375 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-config\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.305429 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-sb\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.407558 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqjqw\" (UniqueName: \"kubernetes.io/projected/47439126-d0ea-45a1-ad04-74057fde2a2b-kube-api-access-fqjqw\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.407634 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-dns-svc\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.407713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-config\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.407744 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-sb\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.407764 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-nb\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.407786 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-openstack-cell1\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.408730 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-openstack-cell1\") 
pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.408854 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-config\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.409119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-sb\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.409336 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-nb\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.409626 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-dns-svc\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.429419 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqjqw\" (UniqueName: \"kubernetes.io/projected/47439126-d0ea-45a1-ad04-74057fde2a2b-kube-api-access-fqjqw\") pod \"dnsmasq-dns-54d469b495-gltpp\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " 
pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:00 crc kubenswrapper[4892]: I0217 19:31:00.508232 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:01 crc kubenswrapper[4892]: I0217 19:31:01.300158 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54d469b495-gltpp"] Feb 17 19:31:01 crc kubenswrapper[4892]: I0217 19:31:01.560396 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d469b495-gltpp" event={"ID":"47439126-d0ea-45a1-ad04-74057fde2a2b","Type":"ContainerStarted","Data":"32ce6f9f025fc9997d67626fbfae1be4ab992a130743ff67f275fbde6a225a35"} Feb 17 19:31:02 crc kubenswrapper[4892]: I0217 19:31:02.573022 4892 generic.go:334] "Generic (PLEG): container finished" podID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerID="db03c918b5ad2da6f79cdaac2e8b3d7ef359ace155d805c6d14b36551181678a" exitCode=0 Feb 17 19:31:02 crc kubenswrapper[4892]: I0217 19:31:02.573066 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d469b495-gltpp" event={"ID":"47439126-d0ea-45a1-ad04-74057fde2a2b","Type":"ContainerDied","Data":"db03c918b5ad2da6f79cdaac2e8b3d7ef359ace155d805c6d14b36551181678a"} Feb 17 19:31:03 crc kubenswrapper[4892]: I0217 19:31:03.584550 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d469b495-gltpp" event={"ID":"47439126-d0ea-45a1-ad04-74057fde2a2b","Type":"ContainerStarted","Data":"e30a6cfb05d3bf3bebc6d3d1907da4843110f4a8461fdc59bc67392e66ba6299"} Feb 17 19:31:03 crc kubenswrapper[4892]: I0217 19:31:03.585244 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:03 crc kubenswrapper[4892]: I0217 19:31:03.614204 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54d469b495-gltpp" podStartSLOduration=3.614157864 
podStartE2EDuration="3.614157864s" podCreationTimestamp="2026-02-17 19:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:31:03.603745792 +0000 UTC m=+6434.979149057" watchObservedRunningTime="2026-02-17 19:31:03.614157864 +0000 UTC m=+6434.989561129" Feb 17 19:31:05 crc kubenswrapper[4892]: I0217 19:31:05.549964 4892 scope.go:117] "RemoveContainer" containerID="d1caf24d56c25f58a84ca8fb005d8b73afc8b43aea5f2fadec2cee809f46f9ba" Feb 17 19:31:05 crc kubenswrapper[4892]: I0217 19:31:05.615652 4892 scope.go:117] "RemoveContainer" containerID="f486a1aef66e93de89256eb5a1cf20ac8317ef5671afda6e70d99c02d18a046c" Feb 17 19:31:05 crc kubenswrapper[4892]: I0217 19:31:05.679855 4892 scope.go:117] "RemoveContainer" containerID="26e250c5841f2649f9ae1d65c7a8e9dc959a847b50ea4bc78cc68af760a591f0" Feb 17 19:31:05 crc kubenswrapper[4892]: I0217 19:31:05.717603 4892 scope.go:117] "RemoveContainer" containerID="097e4bcc8898757b053b51217b7726af600c68ea15dbff18282d663e27363bf9" Feb 17 19:31:07 crc kubenswrapper[4892]: I0217 19:31:07.425043 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:31:07 crc kubenswrapper[4892]: I0217 19:31:07.425371 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.509692 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.624482 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d56cc9f7-v7w88"] Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.624924 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerName="dnsmasq-dns" containerID="cri-o://ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49" gracePeriod=10 Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.816733 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d54db68c9-hsn6k"] Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.820299 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.829452 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d54db68c9-hsn6k"] Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.950809 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-openstack-cell1\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.950935 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-config\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.951010 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-ovsdbserver-sb\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.951076 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcmd2\" (UniqueName: \"kubernetes.io/projected/902b8a22-0008-44c7-a2c5-a1cfebd97794-kube-api-access-pcmd2\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.951134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-ovsdbserver-nb\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:10 crc kubenswrapper[4892]: I0217 19:31:10.951194 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-dns-svc\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.053833 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-dns-svc\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.053983 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-openstack-cell1\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.054059 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-config\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.054142 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-ovsdbserver-sb\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.054305 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcmd2\" (UniqueName: \"kubernetes.io/projected/902b8a22-0008-44c7-a2c5-a1cfebd97794-kube-api-access-pcmd2\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.054386 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-ovsdbserver-nb\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.057196 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-ovsdbserver-nb\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.057296 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-openstack-cell1\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.057481 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-dns-svc\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.058067 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-config\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.058141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/902b8a22-0008-44c7-a2c5-a1cfebd97794-ovsdbserver-sb\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.099455 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcmd2\" (UniqueName: 
\"kubernetes.io/projected/902b8a22-0008-44c7-a2c5-a1cfebd97794-kube-api-access-pcmd2\") pod \"dnsmasq-dns-7d54db68c9-hsn6k\" (UID: \"902b8a22-0008-44c7-a2c5-a1cfebd97794\") " pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.139553 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.274542 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.464849 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-sb\") pod \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.464904 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-nb\") pod \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.464956 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p559m\" (UniqueName: \"kubernetes.io/projected/81c2639e-73db-4dd2-abf6-01d6ddd092a2-kube-api-access-p559m\") pod \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.465233 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-dns-svc\") pod \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\" (UID: 
\"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.465270 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-config\") pod \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\" (UID: \"81c2639e-73db-4dd2-abf6-01d6ddd092a2\") " Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.472853 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c2639e-73db-4dd2-abf6-01d6ddd092a2-kube-api-access-p559m" (OuterVolumeSpecName: "kube-api-access-p559m") pod "81c2639e-73db-4dd2-abf6-01d6ddd092a2" (UID: "81c2639e-73db-4dd2-abf6-01d6ddd092a2"). InnerVolumeSpecName "kube-api-access-p559m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.536610 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81c2639e-73db-4dd2-abf6-01d6ddd092a2" (UID: "81c2639e-73db-4dd2-abf6-01d6ddd092a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.537992 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81c2639e-73db-4dd2-abf6-01d6ddd092a2" (UID: "81c2639e-73db-4dd2-abf6-01d6ddd092a2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.543356 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-config" (OuterVolumeSpecName: "config") pod "81c2639e-73db-4dd2-abf6-01d6ddd092a2" (UID: "81c2639e-73db-4dd2-abf6-01d6ddd092a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.548569 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81c2639e-73db-4dd2-abf6-01d6ddd092a2" (UID: "81c2639e-73db-4dd2-abf6-01d6ddd092a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.567916 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.567951 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.567960 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.567971 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81c2639e-73db-4dd2-abf6-01d6ddd092a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:11 crc kubenswrapper[4892]: 
I0217 19:31:11.567980 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p559m\" (UniqueName: \"kubernetes.io/projected/81c2639e-73db-4dd2-abf6-01d6ddd092a2-kube-api-access-p559m\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.638779 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d54db68c9-hsn6k"] Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.741300 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" event={"ID":"902b8a22-0008-44c7-a2c5-a1cfebd97794","Type":"ContainerStarted","Data":"3841a87e64622fd1f71580bbd098cf134e728deb77b0c3ce7c38520a98144f77"} Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.764602 4892 generic.go:334] "Generic (PLEG): container finished" podID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerID="ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49" exitCode=0 Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.764657 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" event={"ID":"81c2639e-73db-4dd2-abf6-01d6ddd092a2","Type":"ContainerDied","Data":"ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49"} Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.764691 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" event={"ID":"81c2639e-73db-4dd2-abf6-01d6ddd092a2","Type":"ContainerDied","Data":"6e99fa72ec340a0a3c2adcdad13d7e55d86c45ec2972730d585a8063d6ceec03"} Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.764713 4892 scope.go:117] "RemoveContainer" containerID="ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.764792 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d56cc9f7-v7w88" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.914329 4892 scope.go:117] "RemoveContainer" containerID="079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572" Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.939035 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d56cc9f7-v7w88"] Feb 17 19:31:11 crc kubenswrapper[4892]: I0217 19:31:11.955003 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d56cc9f7-v7w88"] Feb 17 19:31:12 crc kubenswrapper[4892]: I0217 19:31:12.010686 4892 scope.go:117] "RemoveContainer" containerID="ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49" Feb 17 19:31:12 crc kubenswrapper[4892]: E0217 19:31:12.011035 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49\": container with ID starting with ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49 not found: ID does not exist" containerID="ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49" Feb 17 19:31:12 crc kubenswrapper[4892]: I0217 19:31:12.011062 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49"} err="failed to get container status \"ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49\": rpc error: code = NotFound desc = could not find container \"ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49\": container with ID starting with ee15e2787e598018c39fa4e1fdbb9a3d8eb0d6660294782d29ce94e1f03ebe49 not found: ID does not exist" Feb 17 19:31:12 crc kubenswrapper[4892]: I0217 19:31:12.011080 4892 scope.go:117] "RemoveContainer" containerID="079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572" Feb 17 
19:31:12 crc kubenswrapper[4892]: E0217 19:31:12.011490 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572\": container with ID starting with 079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572 not found: ID does not exist" containerID="079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572" Feb 17 19:31:12 crc kubenswrapper[4892]: I0217 19:31:12.011512 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572"} err="failed to get container status \"079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572\": rpc error: code = NotFound desc = could not find container \"079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572\": container with ID starting with 079f08ab5e1c77810e6063cc359fa35289a8dda28a3d90303388f69c744e6572 not found: ID does not exist" Feb 17 19:31:12 crc kubenswrapper[4892]: I0217 19:31:12.782409 4892 generic.go:334] "Generic (PLEG): container finished" podID="902b8a22-0008-44c7-a2c5-a1cfebd97794" containerID="bab3622a7df631e259158a73457156edc77a067be33f8bfe792e50acc4084d1b" exitCode=0 Feb 17 19:31:12 crc kubenswrapper[4892]: I0217 19:31:12.782744 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" event={"ID":"902b8a22-0008-44c7-a2c5-a1cfebd97794","Type":"ContainerDied","Data":"bab3622a7df631e259158a73457156edc77a067be33f8bfe792e50acc4084d1b"} Feb 17 19:31:13 crc kubenswrapper[4892]: I0217 19:31:13.388117 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" path="/var/lib/kubelet/pods/81c2639e-73db-4dd2-abf6-01d6ddd092a2/volumes" Feb 17 19:31:13 crc kubenswrapper[4892]: I0217 19:31:13.797709 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" event={"ID":"902b8a22-0008-44c7-a2c5-a1cfebd97794","Type":"ContainerStarted","Data":"13d0ae896917fa9b8a2390fb8702ba3e3d0120d471fd8a383dab63ec0b80a4ac"} Feb 17 19:31:13 crc kubenswrapper[4892]: I0217 19:31:13.798049 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:13 crc kubenswrapper[4892]: I0217 19:31:13.824500 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" podStartSLOduration=3.824481907 podStartE2EDuration="3.824481907s" podCreationTimestamp="2026-02-17 19:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 19:31:13.816277066 +0000 UTC m=+6445.191680361" watchObservedRunningTime="2026-02-17 19:31:13.824481907 +0000 UTC m=+6445.199885172" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.692842 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw"] Feb 17 19:31:18 crc kubenswrapper[4892]: E0217 19:31:18.693904 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerName="dnsmasq-dns" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.693920 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerName="dnsmasq-dns" Feb 17 19:31:18 crc kubenswrapper[4892]: E0217 19:31:18.693963 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerName="init" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.693971 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerName="init" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.694306 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="81c2639e-73db-4dd2-abf6-01d6ddd092a2" containerName="dnsmasq-dns" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.695257 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.698266 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.698897 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.699342 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.701196 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.716276 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw"] Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.849536 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.849610 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ssh-key-openstack-cell1\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.849683 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.849931 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.850032 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkkr\" (UniqueName: \"kubernetes.io/projected/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-kube-api-access-fmkkr\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.951906 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: 
\"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.951970 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.952037 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.952085 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.952114 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkkr\" (UniqueName: \"kubernetes.io/projected/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-kube-api-access-fmkkr\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.959779 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.960612 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.965328 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.968262 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:18 crc kubenswrapper[4892]: I0217 19:31:18.974113 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkkr\" (UniqueName: \"kubernetes.io/projected/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-kube-api-access-fmkkr\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:19 crc kubenswrapper[4892]: I0217 19:31:19.033028 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:31:19 crc kubenswrapper[4892]: I0217 19:31:19.795603 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw"] Feb 17 19:31:19 crc kubenswrapper[4892]: W0217 19:31:19.805448 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod422e1c2e_e6ab_4d2c_ac8b_86e70965fef8.slice/crio-760e4a4b276fcc7a1b257970839627ed7f1fda866a6aa2b312d1feb069c9bd7e WatchSource:0}: Error finding container 760e4a4b276fcc7a1b257970839627ed7f1fda866a6aa2b312d1feb069c9bd7e: Status 404 returned error can't find the container with id 760e4a4b276fcc7a1b257970839627ed7f1fda866a6aa2b312d1feb069c9bd7e Feb 17 19:31:19 crc kubenswrapper[4892]: I0217 19:31:19.872132 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" event={"ID":"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8","Type":"ContainerStarted","Data":"760e4a4b276fcc7a1b257970839627ed7f1fda866a6aa2b312d1feb069c9bd7e"} Feb 17 19:31:21 crc kubenswrapper[4892]: I0217 19:31:21.144982 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d54db68c9-hsn6k" Feb 17 19:31:21 crc kubenswrapper[4892]: I0217 19:31:21.232371 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-54d469b495-gltpp"] Feb 17 19:31:21 crc kubenswrapper[4892]: I0217 19:31:21.232735 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54d469b495-gltpp" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerName="dnsmasq-dns" containerID="cri-o://e30a6cfb05d3bf3bebc6d3d1907da4843110f4a8461fdc59bc67392e66ba6299" gracePeriod=10 Feb 17 19:31:21 crc kubenswrapper[4892]: I0217 19:31:21.921970 4892 generic.go:334] "Generic (PLEG): container finished" podID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerID="e30a6cfb05d3bf3bebc6d3d1907da4843110f4a8461fdc59bc67392e66ba6299" exitCode=0 Feb 17 19:31:21 crc kubenswrapper[4892]: I0217 19:31:21.922013 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d469b495-gltpp" event={"ID":"47439126-d0ea-45a1-ad04-74057fde2a2b","Type":"ContainerDied","Data":"e30a6cfb05d3bf3bebc6d3d1907da4843110f4a8461fdc59bc67392e66ba6299"} Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.056144 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.143562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-openstack-cell1\") pod \"47439126-d0ea-45a1-ad04-74057fde2a2b\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.143678 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqjqw\" (UniqueName: \"kubernetes.io/projected/47439126-d0ea-45a1-ad04-74057fde2a2b-kube-api-access-fqjqw\") pod \"47439126-d0ea-45a1-ad04-74057fde2a2b\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.143729 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-sb\") pod \"47439126-d0ea-45a1-ad04-74057fde2a2b\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.143878 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-dns-svc\") pod \"47439126-d0ea-45a1-ad04-74057fde2a2b\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.143928 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-nb\") pod \"47439126-d0ea-45a1-ad04-74057fde2a2b\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.144028 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-config\") pod \"47439126-d0ea-45a1-ad04-74057fde2a2b\" (UID: \"47439126-d0ea-45a1-ad04-74057fde2a2b\") " Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.149701 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47439126-d0ea-45a1-ad04-74057fde2a2b-kube-api-access-fqjqw" (OuterVolumeSpecName: "kube-api-access-fqjqw") pod "47439126-d0ea-45a1-ad04-74057fde2a2b" (UID: "47439126-d0ea-45a1-ad04-74057fde2a2b"). InnerVolumeSpecName "kube-api-access-fqjqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.217507 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "47439126-d0ea-45a1-ad04-74057fde2a2b" (UID: "47439126-d0ea-45a1-ad04-74057fde2a2b"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.218623 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-config" (OuterVolumeSpecName: "config") pod "47439126-d0ea-45a1-ad04-74057fde2a2b" (UID: "47439126-d0ea-45a1-ad04-74057fde2a2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.223943 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47439126-d0ea-45a1-ad04-74057fde2a2b" (UID: "47439126-d0ea-45a1-ad04-74057fde2a2b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.238155 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47439126-d0ea-45a1-ad04-74057fde2a2b" (UID: "47439126-d0ea-45a1-ad04-74057fde2a2b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.246994 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqjqw\" (UniqueName: \"kubernetes.io/projected/47439126-d0ea-45a1-ad04-74057fde2a2b-kube-api-access-fqjqw\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.247057 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.247066 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.247076 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-config\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.247086 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.256924 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47439126-d0ea-45a1-ad04-74057fde2a2b" (UID: "47439126-d0ea-45a1-ad04-74057fde2a2b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.349656 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47439126-d0ea-45a1-ad04-74057fde2a2b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.937241 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d469b495-gltpp" event={"ID":"47439126-d0ea-45a1-ad04-74057fde2a2b","Type":"ContainerDied","Data":"32ce6f9f025fc9997d67626fbfae1be4ab992a130743ff67f275fbde6a225a35"} Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.937709 4892 scope.go:117] "RemoveContainer" containerID="e30a6cfb05d3bf3bebc6d3d1907da4843110f4a8461fdc59bc67392e66ba6299" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.937321 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54d469b495-gltpp" Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.975880 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54d469b495-gltpp"] Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.985588 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54d469b495-gltpp"] Feb 17 19:31:22 crc kubenswrapper[4892]: I0217 19:31:22.986384 4892 scope.go:117] "RemoveContainer" containerID="db03c918b5ad2da6f79cdaac2e8b3d7ef359ace155d805c6d14b36551181678a" Feb 17 19:31:23 crc kubenswrapper[4892]: I0217 19:31:23.376255 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" path="/var/lib/kubelet/pods/47439126-d0ea-45a1-ad04-74057fde2a2b/volumes" Feb 17 19:31:33 crc kubenswrapper[4892]: E0217 19:31:33.513150 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Feb 17 19:31:33 crc kubenswrapper[4892]: E0217 19:31:33.513904 4892 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 19:31:33 crc kubenswrapper[4892]: container &Container{Name:pre-adoption-validation-openstack-pre-adoption-openstack-cell1,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p osp.edpm.pre_adoption_validation -i pre-adoption-validation-openstack-pre-adoption-openstack-cell1],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_CALLBACKS_ENABLED,Value:ansible.posix.profile_tasks,ValueFrom:nil,},EnvVar{Name:ANSIBLE_CALLBACK_RESULT_FORMAT,Value:yaml,ValueFrom:nil,},EnvVar{Name:ANSIBLE_FORCE_COLOR,Value:True,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DISPLAY_ARGS_TO_STDOUT,Value:True,ValueFrom:nil,},EnvVar{Name:ANSIBLE_SSH_ARGS,Value:-C -o ControlMaster=auto -o 
ControlPersist=80s,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY,Value:1,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Feb 17 19:31:33 crc kubenswrapper[4892]: osp.edpm.pre_adoption_validation Feb 17 19:31:33 crc kubenswrapper[4892]: Feb 17 19:31:33 crc kubenswrapper[4892]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Feb 17 19:31:33 crc kubenswrapper[4892]: edpm_override_hosts: openstack-cell1 Feb 17 19:31:33 crc kubenswrapper[4892]: edpm_service_type: pre-adoption-validation Feb 17 19:31:33 crc kubenswrapper[4892]: edpm_services_override: Feb 17 19:31:33 crc kubenswrapper[4892]: - pre-adoption-validation Feb 17 19:31:33 crc kubenswrapper[4892]: Feb 17 19:31:33 crc kubenswrapper[4892]: Feb 17 19:31:33 crc kubenswrapper[4892]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:pre-adoption-validation-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/pre-adoption-validation,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-cell1,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-cell1,SubPath:ssh_key_openstack-cell1,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmkkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*
1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw_openstack(422e1c2e-e6ab-4d2c-ac8b-86e70965fef8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Feb 17 19:31:33 crc kubenswrapper[4892]: > logger="UnhandledError" Feb 17 19:31:33 crc kubenswrapper[4892]: E0217 19:31:33.515139 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pre-adoption-validation-openstack-pre-adoption-openstack-cell1\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" podUID="422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" Feb 17 19:31:34 crc kubenswrapper[4892]: E0217 19:31:34.107145 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pre-adoption-validation-openstack-pre-adoption-openstack-cell1\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" podUID="422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" Feb 17 19:31:37 crc kubenswrapper[4892]: I0217 19:31:37.425034 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:31:37 crc kubenswrapper[4892]: I0217 19:31:37.425723 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:31:37 crc kubenswrapper[4892]: I0217 19:31:37.425793 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:31:37 crc kubenswrapper[4892]: I0217 19:31:37.427174 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:31:37 crc kubenswrapper[4892]: I0217 19:31:37.427261 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" gracePeriod=600 Feb 17 19:31:37 crc kubenswrapper[4892]: E0217 19:31:37.769395 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:31:38 crc kubenswrapper[4892]: I0217 19:31:38.163085 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" exitCode=0 Feb 17 19:31:38 crc kubenswrapper[4892]: I0217 19:31:38.163131 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09"} Feb 17 19:31:38 crc kubenswrapper[4892]: I0217 19:31:38.163166 4892 scope.go:117] "RemoveContainer" containerID="2c781ca08bf3af58232a3452d8a6e42ca4ea1616baa6e1729ebe4342764473bf" Feb 17 19:31:38 crc kubenswrapper[4892]: I0217 19:31:38.163911 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:31:38 crc kubenswrapper[4892]: E0217 19:31:38.164193 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:31:46 crc kubenswrapper[4892]: I0217 19:31:46.361987 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 19:31:48 crc kubenswrapper[4892]: I0217 19:31:48.319891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" 
event={"ID":"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8","Type":"ContainerStarted","Data":"dfa13c371352fd5ff41da07e988987b40a7b84f5efab5e095b946f800d57ad5e"} Feb 17 19:31:48 crc kubenswrapper[4892]: I0217 19:31:48.357685 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" podStartSLOduration=2.608919162 podStartE2EDuration="30.357660946s" podCreationTimestamp="2026-02-17 19:31:18 +0000 UTC" firstStartedPulling="2026-02-17 19:31:19.80767552 +0000 UTC m=+6451.183078785" lastFinishedPulling="2026-02-17 19:31:47.556417254 +0000 UTC m=+6478.931820569" observedRunningTime="2026-02-17 19:31:48.351101657 +0000 UTC m=+6479.726504932" watchObservedRunningTime="2026-02-17 19:31:48.357660946 +0000 UTC m=+6479.733064231" Feb 17 19:31:50 crc kubenswrapper[4892]: I0217 19:31:50.366269 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:31:50 crc kubenswrapper[4892]: E0217 19:31:50.367455 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:32:02 crc kubenswrapper[4892]: I0217 19:32:02.530972 4892 generic.go:334] "Generic (PLEG): container finished" podID="422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" containerID="dfa13c371352fd5ff41da07e988987b40a7b84f5efab5e095b946f800d57ad5e" exitCode=0 Feb 17 19:32:02 crc kubenswrapper[4892]: I0217 19:32:02.531023 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" 
event={"ID":"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8","Type":"ContainerDied","Data":"dfa13c371352fd5ff41da07e988987b40a7b84f5efab5e095b946f800d57ad5e"} Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.135084 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.267834 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-pre-adoption-validation-combined-ca-bundle\") pod \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.268091 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ssh-key-openstack-cell1\") pod \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.268766 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-inventory\") pod \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.268865 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ceph\") pod \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.268906 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmkkr\" 
(UniqueName: \"kubernetes.io/projected/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-kube-api-access-fmkkr\") pod \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\" (UID: \"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8\") " Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.274050 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" (UID: "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.281138 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ceph" (OuterVolumeSpecName: "ceph") pod "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" (UID: "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.282243 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-kube-api-access-fmkkr" (OuterVolumeSpecName: "kube-api-access-fmkkr") pod "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" (UID: "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8"). InnerVolumeSpecName "kube-api-access-fmkkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.300722 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" (UID: "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.306595 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-inventory" (OuterVolumeSpecName: "inventory") pod "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" (UID: "422e1c2e-e6ab-4d2c-ac8b-86e70965fef8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.371851 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.371898 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.371915 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.371934 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmkkr\" (UniqueName: \"kubernetes.io/projected/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-kube-api-access-fmkkr\") on node \"crc\" DevicePath \"\"" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.371955 4892 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/422e1c2e-e6ab-4d2c-ac8b-86e70965fef8-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.567792 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" event={"ID":"422e1c2e-e6ab-4d2c-ac8b-86e70965fef8","Type":"ContainerDied","Data":"760e4a4b276fcc7a1b257970839627ed7f1fda866a6aa2b312d1feb069c9bd7e"} Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.568190 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760e4a4b276fcc7a1b257970839627ed7f1fda866a6aa2b312d1feb069c9bd7e" Feb 17 19:32:04 crc kubenswrapper[4892]: I0217 19:32:04.567926 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw" Feb 17 19:32:05 crc kubenswrapper[4892]: I0217 19:32:05.359971 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:32:05 crc kubenswrapper[4892]: E0217 19:32:05.360308 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:32:05 crc kubenswrapper[4892]: I0217 19:32:05.918689 4892 scope.go:117] "RemoveContainer" containerID="5aa89074cc9ec5ffe1244cb742c7b589a0451dadf5d4241635ec96d3d80e6274" Feb 17 19:32:05 crc kubenswrapper[4892]: I0217 19:32:05.959800 4892 scope.go:117] "RemoveContainer" containerID="9a15f6f6fdc92aa1fc1b7bb83a6e66b2aa71891dee85dd8553941908a3d17f54" Feb 17 19:32:05 crc kubenswrapper[4892]: I0217 19:32:05.983398 4892 scope.go:117] "RemoveContainer" containerID="01c7e151913dc1faaf2c706b3516aa0e455eb28cd829add0ab4eb03c81e0ccec" Feb 17 19:32:06 crc kubenswrapper[4892]: I0217 19:32:06.164465 4892 scope.go:117] "RemoveContainer" 
containerID="3a77051d0aabb5ab375c3bae5a2ebf213d884d7c99f60dee1f039da4923e7bd3" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.400941 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd"] Feb 17 19:32:07 crc kubenswrapper[4892]: E0217 19:32:07.439233 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerName="init" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.439286 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerName="init" Feb 17 19:32:07 crc kubenswrapper[4892]: E0217 19:32:07.439417 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerName="dnsmasq-dns" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.439432 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerName="dnsmasq-dns" Feb 17 19:32:07 crc kubenswrapper[4892]: E0217 19:32:07.439524 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.439554 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.440537 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="422e1c2e-e6ab-4d2c-ac8b-86e70965fef8" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.440573 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="47439126-d0ea-45a1-ad04-74057fde2a2b" containerName="dnsmasq-dns" Feb 17 19:32:07 crc 
kubenswrapper[4892]: I0217 19:32:07.441873 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd"] Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.441979 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.445790 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.446047 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.446148 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.446188 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.546537 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86zr\" (UniqueName: \"kubernetes.io/projected/b17be7f8-f4d0-434f-b0b0-010faf440574-kube-api-access-w86zr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.546589 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.546655 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.547013 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.547327 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.649486 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.649680 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-w86zr\" (UniqueName: \"kubernetes.io/projected/b17be7f8-f4d0-434f-b0b0-010faf440574-kube-api-access-w86zr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.649719 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.649785 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.649866 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.655606 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: 
\"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.656099 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.657280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.658609 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.676725 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w86zr\" (UniqueName: \"kubernetes.io/projected/b17be7f8-f4d0-434f-b0b0-010faf440574-kube-api-access-w86zr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:07 crc kubenswrapper[4892]: I0217 19:32:07.766741 4892 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" Feb 17 19:32:08 crc kubenswrapper[4892]: I0217 19:32:08.371500 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd"] Feb 17 19:32:08 crc kubenswrapper[4892]: I0217 19:32:08.623040 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" event={"ID":"b17be7f8-f4d0-434f-b0b0-010faf440574","Type":"ContainerStarted","Data":"9237e4fcfccfa95c650e640ee676b43de84251ea3f27860627228ea781a23eb1"} Feb 17 19:32:09 crc kubenswrapper[4892]: I0217 19:32:09.661095 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" event={"ID":"b17be7f8-f4d0-434f-b0b0-010faf440574","Type":"ContainerStarted","Data":"f9461f639d8ca1093f31de86be7310fd6d5db58d14b83ea7b67ab80764274297"} Feb 17 19:32:09 crc kubenswrapper[4892]: I0217 19:32:09.692036 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" podStartSLOduration=2.082815949 podStartE2EDuration="2.692018418s" podCreationTimestamp="2026-02-17 19:32:07 +0000 UTC" firstStartedPulling="2026-02-17 19:32:08.388483093 +0000 UTC m=+6499.763886378" lastFinishedPulling="2026-02-17 19:32:08.997685572 +0000 UTC m=+6500.373088847" observedRunningTime="2026-02-17 19:32:09.68838122 +0000 UTC m=+6501.063784495" watchObservedRunningTime="2026-02-17 19:32:09.692018418 +0000 UTC m=+6501.067421693" Feb 17 19:32:16 crc kubenswrapper[4892]: I0217 19:32:16.360131 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:32:16 crc kubenswrapper[4892]: E0217 19:32:16.361146 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:32:30 crc kubenswrapper[4892]: I0217 19:32:30.359766 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:32:30 crc kubenswrapper[4892]: E0217 19:32:30.360818 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:32:42 crc kubenswrapper[4892]: I0217 19:32:42.360413 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:32:42 crc kubenswrapper[4892]: E0217 19:32:42.361337 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:32:56 crc kubenswrapper[4892]: I0217 19:32:56.362730 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:32:56 crc kubenswrapper[4892]: E0217 19:32:56.363503 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:33:09 crc kubenswrapper[4892]: I0217 19:33:09.373344 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:33:09 crc kubenswrapper[4892]: E0217 19:33:09.374945 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:33:21 crc kubenswrapper[4892]: I0217 19:33:21.055285 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-hpq92"] Feb 17 19:33:21 crc kubenswrapper[4892]: I0217 19:33:21.066051 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-hpq92"] Feb 17 19:33:21 crc kubenswrapper[4892]: I0217 19:33:21.385126 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026c9245-5963-4614-b96a-a9d6c2a6b839" path="/var/lib/kubelet/pods/026c9245-5963-4614-b96a-a9d6c2a6b839/volumes" Feb 17 19:33:22 crc kubenswrapper[4892]: I0217 19:33:22.043040 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-0986-account-create-update-r9xjq"] Feb 17 19:33:22 crc kubenswrapper[4892]: I0217 19:33:22.061284 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-0986-account-create-update-r9xjq"] Feb 17 19:33:23 crc kubenswrapper[4892]: I0217 19:33:23.376152 4892 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b778e62-5003-4544-8653-d5a6c7b62648" path="/var/lib/kubelet/pods/5b778e62-5003-4544-8653-d5a6c7b62648/volumes" Feb 17 19:33:24 crc kubenswrapper[4892]: I0217 19:33:24.359416 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:33:24 crc kubenswrapper[4892]: E0217 19:33:24.360010 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:33:28 crc kubenswrapper[4892]: I0217 19:33:28.043670 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-kn955"] Feb 17 19:33:28 crc kubenswrapper[4892]: I0217 19:33:28.060263 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-kn955"] Feb 17 19:33:29 crc kubenswrapper[4892]: I0217 19:33:29.051067 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-096a-account-create-update-tnsbx"] Feb 17 19:33:29 crc kubenswrapper[4892]: I0217 19:33:29.068475 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-096a-account-create-update-tnsbx"] Feb 17 19:33:29 crc kubenswrapper[4892]: I0217 19:33:29.382461 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be7a9e1-cad7-4881-b176-c238e08665a9" path="/var/lib/kubelet/pods/6be7a9e1-cad7-4881-b176-c238e08665a9/volumes" Feb 17 19:33:29 crc kubenswrapper[4892]: I0217 19:33:29.384605 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a" 
path="/var/lib/kubelet/pods/6f2653d2-fc1e-4bcb-ad8c-0c372dfb0f4a/volumes" Feb 17 19:33:35 crc kubenswrapper[4892]: I0217 19:33:35.359809 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:33:35 crc kubenswrapper[4892]: E0217 19:33:35.361010 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:33:47 crc kubenswrapper[4892]: I0217 19:33:47.360868 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:33:47 crc kubenswrapper[4892]: E0217 19:33:47.361988 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:33:58 crc kubenswrapper[4892]: I0217 19:33:58.360742 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:33:58 crc kubenswrapper[4892]: E0217 19:33:58.361760 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:34:06 crc kubenswrapper[4892]: I0217 19:34:06.509231 4892 scope.go:117] "RemoveContainer" containerID="45c16671d07799a9d496923becc2da51857aca42d04c2e7751709c015ae8c017" Feb 17 19:34:06 crc kubenswrapper[4892]: I0217 19:34:06.543135 4892 scope.go:117] "RemoveContainer" containerID="e8c54432f1b557347791329d315a7383365a73a9f9b3d488d6c7b3cc0ce2930d" Feb 17 19:34:06 crc kubenswrapper[4892]: I0217 19:34:06.678437 4892 scope.go:117] "RemoveContainer" containerID="08338a763062d1e3fe63c443c8caf6ebf457fa93055508058d82076f018e12a7" Feb 17 19:34:06 crc kubenswrapper[4892]: I0217 19:34:06.735503 4892 scope.go:117] "RemoveContainer" containerID="e4a28dbe11c4731fbaf8dc60346b4b2da6f455dd469edc28be32354c857797f1" Feb 17 19:34:07 crc kubenswrapper[4892]: I0217 19:34:07.069358 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-zfzzm"] Feb 17 19:34:07 crc kubenswrapper[4892]: I0217 19:34:07.084802 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-zfzzm"] Feb 17 19:34:07 crc kubenswrapper[4892]: I0217 19:34:07.380211 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909c89b3-4107-4f01-b4ec-3ecea629d2d4" path="/var/lib/kubelet/pods/909c89b3-4107-4f01-b4ec-3ecea629d2d4/volumes" Feb 17 19:34:09 crc kubenswrapper[4892]: I0217 19:34:09.372592 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:34:09 crc kubenswrapper[4892]: E0217 19:34:09.386028 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:34:21 crc kubenswrapper[4892]: I0217 19:34:21.360316 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:34:21 crc kubenswrapper[4892]: E0217 19:34:21.361499 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:34:33 crc kubenswrapper[4892]: I0217 19:34:33.368187 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:34:33 crc kubenswrapper[4892]: E0217 19:34:33.369517 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:34:48 crc kubenswrapper[4892]: I0217 19:34:48.360014 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:34:48 crc kubenswrapper[4892]: E0217 19:34:48.361225 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:35:03 crc kubenswrapper[4892]: I0217 19:35:03.360711 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:35:03 crc kubenswrapper[4892]: E0217 19:35:03.361831 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:35:06 crc kubenswrapper[4892]: I0217 19:35:06.933756 4892 scope.go:117] "RemoveContainer" containerID="efe0e35522ac404b0815649ab9349270b70e570ee79ab52083cf7938fe9285dc" Feb 17 19:35:06 crc kubenswrapper[4892]: I0217 19:35:06.965545 4892 scope.go:117] "RemoveContainer" containerID="4fefcb8ddad77b3ebb1a22470cdcc07b188cd554704bc6755ae777cdcfae6a52" Feb 17 19:35:07 crc kubenswrapper[4892]: I0217 19:35:07.002377 4892 scope.go:117] "RemoveContainer" containerID="3f10b4fc486f1f3b519ee8504b216b2ba4a9f79fdddc06c664d2d1187819d9fb" Feb 17 19:35:07 crc kubenswrapper[4892]: I0217 19:35:07.044231 4892 scope.go:117] "RemoveContainer" containerID="6696b36d09e84647ded1a7da2719c938844118d729b77876495a8f9e895c2e4c" Feb 17 19:35:07 crc kubenswrapper[4892]: I0217 19:35:07.104207 4892 scope.go:117] "RemoveContainer" containerID="5904c1c707391722ce5792e20eb31627df929cb22c22d91b0ad4b549e3bede90" Feb 17 19:35:07 crc kubenswrapper[4892]: I0217 19:35:07.134816 4892 scope.go:117] "RemoveContainer" containerID="f0fc1541762ff375ec48c4af8b28fe89e3d6e560ab6030c282b84a2446b1d4fd" Feb 17 19:35:15 crc 
kubenswrapper[4892]: I0217 19:35:15.359989 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:35:15 crc kubenswrapper[4892]: E0217 19:35:15.360666 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:35:30 crc kubenswrapper[4892]: I0217 19:35:30.362493 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:35:30 crc kubenswrapper[4892]: E0217 19:35:30.363994 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:35:44 crc kubenswrapper[4892]: I0217 19:35:44.360223 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:35:44 crc kubenswrapper[4892]: E0217 19:35:44.361050 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 
17 19:35:55 crc kubenswrapper[4892]: I0217 19:35:55.684795 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:35:55 crc kubenswrapper[4892]: E0217 19:35:55.685614 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:36:07 crc kubenswrapper[4892]: I0217 19:36:07.245536 4892 scope.go:117] "RemoveContainer" containerID="ca1759fe3213ccf74056245143820c16aa75c8882f71fb4c2b5367c1df147716" Feb 17 19:36:07 crc kubenswrapper[4892]: I0217 19:36:07.289888 4892 scope.go:117] "RemoveContainer" containerID="2fd09ccb026953c0d4fdbf7c124f47e8c78a8a4ca9b8831e521388b680f71874" Feb 17 19:36:07 crc kubenswrapper[4892]: I0217 19:36:07.321631 4892 scope.go:117] "RemoveContainer" containerID="e4e7f3de758dcffad3c1b7afb225ab969ab3302b0b7b5d525f37ffca623a0c9d" Feb 17 19:36:07 crc kubenswrapper[4892]: I0217 19:36:07.353116 4892 scope.go:117] "RemoveContainer" containerID="be644b50ba916308679414702f03d47e40b84c47c7741b31501290e4bed7ba4c" Feb 17 19:36:08 crc kubenswrapper[4892]: I0217 19:36:08.359524 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:36:08 crc kubenswrapper[4892]: E0217 19:36:08.359949 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:36:20 crc kubenswrapper[4892]: I0217 19:36:20.371573 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:36:20 crc kubenswrapper[4892]: E0217 19:36:20.372904 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:36:33 crc kubenswrapper[4892]: I0217 19:36:33.366611 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:36:33 crc kubenswrapper[4892]: E0217 19:36:33.367455 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:36:37 crc kubenswrapper[4892]: I0217 19:36:37.048579 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-4h7h4"] Feb 17 19:36:37 crc kubenswrapper[4892]: I0217 19:36:37.064509 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-d2c7-account-create-update-fkq2k"] Feb 17 19:36:37 crc kubenswrapper[4892]: I0217 19:36:37.075835 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-d2c7-account-create-update-fkq2k"] Feb 17 19:36:37 crc 
kubenswrapper[4892]: I0217 19:36:37.088972 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-4h7h4"] Feb 17 19:36:37 crc kubenswrapper[4892]: I0217 19:36:37.427402 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fe62af-2689-46c8-837f-6a6af7f92052" path="/var/lib/kubelet/pods/19fe62af-2689-46c8-837f-6a6af7f92052/volumes" Feb 17 19:36:37 crc kubenswrapper[4892]: I0217 19:36:37.428192 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ead331a-f404-4cd9-9b84-00f5804cf185" path="/var/lib/kubelet/pods/9ead331a-f404-4cd9-9b84-00f5804cf185/volumes" Feb 17 19:36:47 crc kubenswrapper[4892]: I0217 19:36:47.359857 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:36:48 crc kubenswrapper[4892]: I0217 19:36:48.479082 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"d2b2efd3d5988ce334d039949155348879bf6dbf519c1878bea365e6fb91554e"} Feb 17 19:36:51 crc kubenswrapper[4892]: I0217 19:36:51.038584 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-zwvbt"] Feb 17 19:36:51 crc kubenswrapper[4892]: I0217 19:36:51.054770 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-zwvbt"] Feb 17 19:36:51 crc kubenswrapper[4892]: I0217 19:36:51.393468 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ec9d13-098a-4249-8f59-db24d25a7ff9" path="/var/lib/kubelet/pods/61ec9d13-098a-4249-8f59-db24d25a7ff9/volumes" Feb 17 19:37:07 crc kubenswrapper[4892]: I0217 19:37:07.433462 4892 scope.go:117] "RemoveContainer" containerID="17c5421290a2046338b02c97de1fcba92ead909ff6b4e9d121e0ce66c1e73915" Feb 17 19:37:07 crc kubenswrapper[4892]: I0217 19:37:07.485991 4892 scope.go:117] "RemoveContainer" 
containerID="bc71ffe5efcce70bf0ea42a08027ff9d77d96e1e1586ea6be9677ca6d4614b42" Feb 17 19:37:07 crc kubenswrapper[4892]: I0217 19:37:07.539574 4892 scope.go:117] "RemoveContainer" containerID="0f596e43bd6920656122497cc29e605b2ff65ab29a1a2307c12b16f7161b7eb9" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.015991 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mgg9h"] Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.019358 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.057801 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgg9h"] Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.109115 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-catalog-content\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.109610 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4hps\" (UniqueName: \"kubernetes.io/projected/fc0d8760-1412-4684-9dea-a8ee8e058586-kube-api-access-t4hps\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.109693 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-utilities\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " 
pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.212288 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4hps\" (UniqueName: \"kubernetes.io/projected/fc0d8760-1412-4684-9dea-a8ee8e058586-kube-api-access-t4hps\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.212557 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-utilities\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.212787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-catalog-content\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.213052 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-utilities\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.213110 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-catalog-content\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " 
pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.237341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4hps\" (UniqueName: \"kubernetes.io/projected/fc0d8760-1412-4684-9dea-a8ee8e058586-kube-api-access-t4hps\") pod \"certified-operators-mgg9h\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.352593 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:43 crc kubenswrapper[4892]: I0217 19:37:43.908072 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgg9h"] Feb 17 19:37:44 crc kubenswrapper[4892]: I0217 19:37:44.228419 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerStarted","Data":"bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081"} Feb 17 19:37:44 crc kubenswrapper[4892]: I0217 19:37:44.228905 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerStarted","Data":"e33b73db8245fca74ed0c6e91a804b0da81058296388682782f33516914f4816"} Feb 17 19:37:45 crc kubenswrapper[4892]: I0217 19:37:45.241320 4892 generic.go:334] "Generic (PLEG): container finished" podID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerID="bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081" exitCode=0 Feb 17 19:37:45 crc kubenswrapper[4892]: I0217 19:37:45.241658 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" 
event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerDied","Data":"bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081"} Feb 17 19:37:45 crc kubenswrapper[4892]: I0217 19:37:45.244007 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 19:37:47 crc kubenswrapper[4892]: I0217 19:37:47.267400 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerStarted","Data":"1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad"} Feb 17 19:37:49 crc kubenswrapper[4892]: I0217 19:37:49.288661 4892 generic.go:334] "Generic (PLEG): container finished" podID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerID="1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad" exitCode=0 Feb 17 19:37:49 crc kubenswrapper[4892]: I0217 19:37:49.288743 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerDied","Data":"1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad"} Feb 17 19:37:51 crc kubenswrapper[4892]: I0217 19:37:51.316413 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerStarted","Data":"4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066"} Feb 17 19:37:51 crc kubenswrapper[4892]: I0217 19:37:51.346742 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mgg9h" podStartSLOduration=4.346201689 podStartE2EDuration="9.346716136s" podCreationTimestamp="2026-02-17 19:37:42 +0000 UTC" firstStartedPulling="2026-02-17 19:37:45.243627656 +0000 UTC m=+6836.619030931" lastFinishedPulling="2026-02-17 19:37:50.244142113 +0000 UTC 
m=+6841.619545378" observedRunningTime="2026-02-17 19:37:51.343325854 +0000 UTC m=+6842.718729199" watchObservedRunningTime="2026-02-17 19:37:51.346716136 +0000 UTC m=+6842.722119431" Feb 17 19:37:53 crc kubenswrapper[4892]: I0217 19:37:53.353466 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:53 crc kubenswrapper[4892]: I0217 19:37:53.353982 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:37:54 crc kubenswrapper[4892]: I0217 19:37:54.458216 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mgg9h" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="registry-server" probeResult="failure" output=< Feb 17 19:37:54 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:37:54 crc kubenswrapper[4892]: > Feb 17 19:38:03 crc kubenswrapper[4892]: I0217 19:38:03.405598 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:38:03 crc kubenswrapper[4892]: I0217 19:38:03.484141 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:38:03 crc kubenswrapper[4892]: I0217 19:38:03.646357 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgg9h"] Feb 17 19:38:04 crc kubenswrapper[4892]: I0217 19:38:04.468001 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mgg9h" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="registry-server" containerID="cri-o://4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066" gracePeriod=2 Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.072754 4892 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.178397 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4hps\" (UniqueName: \"kubernetes.io/projected/fc0d8760-1412-4684-9dea-a8ee8e058586-kube-api-access-t4hps\") pod \"fc0d8760-1412-4684-9dea-a8ee8e058586\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.178496 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-catalog-content\") pod \"fc0d8760-1412-4684-9dea-a8ee8e058586\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.178653 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-utilities\") pod \"fc0d8760-1412-4684-9dea-a8ee8e058586\" (UID: \"fc0d8760-1412-4684-9dea-a8ee8e058586\") " Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.179660 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-utilities" (OuterVolumeSpecName: "utilities") pod "fc0d8760-1412-4684-9dea-a8ee8e058586" (UID: "fc0d8760-1412-4684-9dea-a8ee8e058586"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.181435 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.186064 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc0d8760-1412-4684-9dea-a8ee8e058586-kube-api-access-t4hps" (OuterVolumeSpecName: "kube-api-access-t4hps") pod "fc0d8760-1412-4684-9dea-a8ee8e058586" (UID: "fc0d8760-1412-4684-9dea-a8ee8e058586"). InnerVolumeSpecName "kube-api-access-t4hps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.250059 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc0d8760-1412-4684-9dea-a8ee8e058586" (UID: "fc0d8760-1412-4684-9dea-a8ee8e058586"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.283832 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4hps\" (UniqueName: \"kubernetes.io/projected/fc0d8760-1412-4684-9dea-a8ee8e058586-kube-api-access-t4hps\") on node \"crc\" DevicePath \"\"" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.283871 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0d8760-1412-4684-9dea-a8ee8e058586-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.490541 4892 generic.go:334] "Generic (PLEG): container finished" podID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerID="4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066" exitCode=0 Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.490573 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgg9h" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.490590 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerDied","Data":"4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066"} Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.490617 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgg9h" event={"ID":"fc0d8760-1412-4684-9dea-a8ee8e058586","Type":"ContainerDied","Data":"e33b73db8245fca74ed0c6e91a804b0da81058296388682782f33516914f4816"} Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.490634 4892 scope.go:117] "RemoveContainer" containerID="4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.520870 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-mgg9h"] Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.531019 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mgg9h"] Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.533316 4892 scope.go:117] "RemoveContainer" containerID="1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.557074 4892 scope.go:117] "RemoveContainer" containerID="bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.637470 4892 scope.go:117] "RemoveContainer" containerID="4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066" Feb 17 19:38:05 crc kubenswrapper[4892]: E0217 19:38:05.644582 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066\": container with ID starting with 4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066 not found: ID does not exist" containerID="4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.644617 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066"} err="failed to get container status \"4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066\": rpc error: code = NotFound desc = could not find container \"4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066\": container with ID starting with 4c9c082bbe32a0affe5d658ffa99a6c6ad8f185beaaed34ecccc663984c96066 not found: ID does not exist" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.644641 4892 scope.go:117] "RemoveContainer" 
containerID="1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad" Feb 17 19:38:05 crc kubenswrapper[4892]: E0217 19:38:05.645109 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad\": container with ID starting with 1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad not found: ID does not exist" containerID="1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.645139 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad"} err="failed to get container status \"1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad\": rpc error: code = NotFound desc = could not find container \"1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad\": container with ID starting with 1c715544b090d181df67d9b16f42834944ff6288f0a7b80fc3b35a3e5fa8d8ad not found: ID does not exist" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.645156 4892 scope.go:117] "RemoveContainer" containerID="bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081" Feb 17 19:38:05 crc kubenswrapper[4892]: E0217 19:38:05.647547 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081\": container with ID starting with bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081 not found: ID does not exist" containerID="bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081" Feb 17 19:38:05 crc kubenswrapper[4892]: I0217 19:38:05.647577 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081"} err="failed to get container status \"bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081\": rpc error: code = NotFound desc = could not find container \"bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081\": container with ID starting with bdcf667f32b2264b5e2ce57e1281d3f018d810ed91d9ea7fe49e76ceb9d16081 not found: ID does not exist" Feb 17 19:38:07 crc kubenswrapper[4892]: I0217 19:38:07.374903 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" path="/var/lib/kubelet/pods/fc0d8760-1412-4684-9dea-a8ee8e058586/volumes" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.619517 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m496q"] Feb 17 19:38:19 crc kubenswrapper[4892]: E0217 19:38:19.621178 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="registry-server" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.621204 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="registry-server" Feb 17 19:38:19 crc kubenswrapper[4892]: E0217 19:38:19.621313 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="extract-utilities" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.621329 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="extract-utilities" Feb 17 19:38:19 crc kubenswrapper[4892]: E0217 19:38:19.621350 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="extract-content" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.621366 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="extract-content" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.622009 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc0d8760-1412-4684-9dea-a8ee8e058586" containerName="registry-server" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.628128 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.643679 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m496q"] Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.794665 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2zbv\" (UniqueName: \"kubernetes.io/projected/45d3b930-cea9-4bab-9963-2509288bfad1-kube-api-access-r2zbv\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.795046 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-catalog-content\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.795234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-utilities\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.897669 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r2zbv\" (UniqueName: \"kubernetes.io/projected/45d3b930-cea9-4bab-9963-2509288bfad1-kube-api-access-r2zbv\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.897881 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-catalog-content\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.898364 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-utilities\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.898541 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-catalog-content\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.898571 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-utilities\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.919144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r2zbv\" (UniqueName: \"kubernetes.io/projected/45d3b930-cea9-4bab-9963-2509288bfad1-kube-api-access-r2zbv\") pod \"redhat-marketplace-m496q\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:19 crc kubenswrapper[4892]: I0217 19:38:19.958411 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:20 crc kubenswrapper[4892]: I0217 19:38:20.482139 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m496q"] Feb 17 19:38:20 crc kubenswrapper[4892]: I0217 19:38:20.737419 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerStarted","Data":"b0228ec19449c96bbe527c6482145db3e5a6ddcf03c09b2fcb778e9281b17eb6"} Feb 17 19:38:21 crc kubenswrapper[4892]: I0217 19:38:21.752906 4892 generic.go:334] "Generic (PLEG): container finished" podID="45d3b930-cea9-4bab-9963-2509288bfad1" containerID="bc6b6a07db8961b099aa427ac793cae7aceaafb1ae833d7d76d3581ab0089c41" exitCode=0 Feb 17 19:38:21 crc kubenswrapper[4892]: I0217 19:38:21.753062 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerDied","Data":"bc6b6a07db8961b099aa427ac793cae7aceaafb1ae833d7d76d3581ab0089c41"} Feb 17 19:38:22 crc kubenswrapper[4892]: I0217 19:38:22.766260 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerStarted","Data":"52cc33f3c2a46f8990548bb3a046b457dfe4e5555c70ce9af6881a3e7e79f578"} Feb 17 19:38:25 crc kubenswrapper[4892]: I0217 19:38:25.838751 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="45d3b930-cea9-4bab-9963-2509288bfad1" containerID="52cc33f3c2a46f8990548bb3a046b457dfe4e5555c70ce9af6881a3e7e79f578" exitCode=0 Feb 17 19:38:25 crc kubenswrapper[4892]: I0217 19:38:25.839248 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerDied","Data":"52cc33f3c2a46f8990548bb3a046b457dfe4e5555c70ce9af6881a3e7e79f578"} Feb 17 19:38:26 crc kubenswrapper[4892]: I0217 19:38:26.852442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerStarted","Data":"e88b823cd0d853879de6c4890d20d6ae8025e2ac2c36954d702b143b0919fa43"} Feb 17 19:38:26 crc kubenswrapper[4892]: I0217 19:38:26.878848 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m496q" podStartSLOduration=3.101008944 podStartE2EDuration="7.878806629s" podCreationTimestamp="2026-02-17 19:38:19 +0000 UTC" firstStartedPulling="2026-02-17 19:38:21.75706942 +0000 UTC m=+6873.132472685" lastFinishedPulling="2026-02-17 19:38:26.534867105 +0000 UTC m=+6877.910270370" observedRunningTime="2026-02-17 19:38:26.86662766 +0000 UTC m=+6878.242030945" watchObservedRunningTime="2026-02-17 19:38:26.878806629 +0000 UTC m=+6878.254209904" Feb 17 19:38:29 crc kubenswrapper[4892]: I0217 19:38:29.959317 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:29 crc kubenswrapper[4892]: I0217 19:38:29.960279 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:30 crc kubenswrapper[4892]: I0217 19:38:30.059674 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:37 crc 
kubenswrapper[4892]: I0217 19:38:37.715427 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zbh4l"] Feb 17 19:38:37 crc kubenswrapper[4892]: I0217 19:38:37.720317 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:37 crc kubenswrapper[4892]: I0217 19:38:37.727933 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zbh4l"] Feb 17 19:38:37 crc kubenswrapper[4892]: I0217 19:38:37.919058 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-utilities\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:37 crc kubenswrapper[4892]: I0217 19:38:37.919563 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbv7\" (UniqueName: \"kubernetes.io/projected/86dbc6cd-3de6-414c-a422-53507111e8ca-kube-api-access-5qbv7\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:37 crc kubenswrapper[4892]: I0217 19:38:37.919794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-catalog-content\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.022798 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbv7\" (UniqueName: 
\"kubernetes.io/projected/86dbc6cd-3de6-414c-a422-53507111e8ca-kube-api-access-5qbv7\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.022889 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-catalog-content\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.022968 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-utilities\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.023456 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-utilities\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.023733 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-catalog-content\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.047871 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbv7\" (UniqueName: 
\"kubernetes.io/projected/86dbc6cd-3de6-414c-a422-53507111e8ca-kube-api-access-5qbv7\") pod \"redhat-operators-zbh4l\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.049752 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:38 crc kubenswrapper[4892]: I0217 19:38:38.568416 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zbh4l"] Feb 17 19:38:38 crc kubenswrapper[4892]: W0217 19:38:38.574239 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86dbc6cd_3de6_414c_a422_53507111e8ca.slice/crio-eaaf18da88985c21adfcd15e1471653f22bf5a05d5186b8dfb179b8e08850654 WatchSource:0}: Error finding container eaaf18da88985c21adfcd15e1471653f22bf5a05d5186b8dfb179b8e08850654: Status 404 returned error can't find the container with id eaaf18da88985c21adfcd15e1471653f22bf5a05d5186b8dfb179b8e08850654 Feb 17 19:38:39 crc kubenswrapper[4892]: I0217 19:38:39.022245 4892 generic.go:334] "Generic (PLEG): container finished" podID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerID="d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1" exitCode=0 Feb 17 19:38:39 crc kubenswrapper[4892]: I0217 19:38:39.022287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerDied","Data":"d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1"} Feb 17 19:38:39 crc kubenswrapper[4892]: I0217 19:38:39.022310 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" 
event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerStarted","Data":"eaaf18da88985c21adfcd15e1471653f22bf5a05d5186b8dfb179b8e08850654"} Feb 17 19:38:40 crc kubenswrapper[4892]: I0217 19:38:40.024618 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:40 crc kubenswrapper[4892]: I0217 19:38:40.032747 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerStarted","Data":"5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313"} Feb 17 19:38:42 crc kubenswrapper[4892]: I0217 19:38:42.290195 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m496q"] Feb 17 19:38:42 crc kubenswrapper[4892]: I0217 19:38:42.290769 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m496q" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="registry-server" containerID="cri-o://e88b823cd0d853879de6c4890d20d6ae8025e2ac2c36954d702b143b0919fa43" gracePeriod=2 Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.080338 4892 generic.go:334] "Generic (PLEG): container finished" podID="45d3b930-cea9-4bab-9963-2509288bfad1" containerID="e88b823cd0d853879de6c4890d20d6ae8025e2ac2c36954d702b143b0919fa43" exitCode=0 Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.080405 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerDied","Data":"e88b823cd0d853879de6c4890d20d6ae8025e2ac2c36954d702b143b0919fa43"} Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.348078 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.377772 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2zbv\" (UniqueName: \"kubernetes.io/projected/45d3b930-cea9-4bab-9963-2509288bfad1-kube-api-access-r2zbv\") pod \"45d3b930-cea9-4bab-9963-2509288bfad1\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.378201 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-catalog-content\") pod \"45d3b930-cea9-4bab-9963-2509288bfad1\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.378274 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-utilities\") pod \"45d3b930-cea9-4bab-9963-2509288bfad1\" (UID: \"45d3b930-cea9-4bab-9963-2509288bfad1\") " Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.380943 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-utilities" (OuterVolumeSpecName: "utilities") pod "45d3b930-cea9-4bab-9963-2509288bfad1" (UID: "45d3b930-cea9-4bab-9963-2509288bfad1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.395128 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d3b930-cea9-4bab-9963-2509288bfad1-kube-api-access-r2zbv" (OuterVolumeSpecName: "kube-api-access-r2zbv") pod "45d3b930-cea9-4bab-9963-2509288bfad1" (UID: "45d3b930-cea9-4bab-9963-2509288bfad1"). InnerVolumeSpecName "kube-api-access-r2zbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.416766 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45d3b930-cea9-4bab-9963-2509288bfad1" (UID: "45d3b930-cea9-4bab-9963-2509288bfad1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.481208 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.481249 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d3b930-cea9-4bab-9963-2509288bfad1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:38:43 crc kubenswrapper[4892]: I0217 19:38:43.481263 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2zbv\" (UniqueName: \"kubernetes.io/projected/45d3b930-cea9-4bab-9963-2509288bfad1-kube-api-access-r2zbv\") on node \"crc\" DevicePath \"\"" Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.094283 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m496q" event={"ID":"45d3b930-cea9-4bab-9963-2509288bfad1","Type":"ContainerDied","Data":"b0228ec19449c96bbe527c6482145db3e5a6ddcf03c09b2fcb778e9281b17eb6"} Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.094330 4892 scope.go:117] "RemoveContainer" containerID="e88b823cd0d853879de6c4890d20d6ae8025e2ac2c36954d702b143b0919fa43" Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.094382 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m496q" Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.137350 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m496q"] Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.155177 4892 scope.go:117] "RemoveContainer" containerID="52cc33f3c2a46f8990548bb3a046b457dfe4e5555c70ce9af6881a3e7e79f578" Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.158114 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m496q"] Feb 17 19:38:44 crc kubenswrapper[4892]: I0217 19:38:44.194126 4892 scope.go:117] "RemoveContainer" containerID="bc6b6a07db8961b099aa427ac793cae7aceaafb1ae833d7d76d3581ab0089c41" Feb 17 19:38:45 crc kubenswrapper[4892]: I0217 19:38:45.374018 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" path="/var/lib/kubelet/pods/45d3b930-cea9-4bab-9963-2509288bfad1/volumes" Feb 17 19:38:49 crc kubenswrapper[4892]: I0217 19:38:49.168720 4892 generic.go:334] "Generic (PLEG): container finished" podID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerID="5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313" exitCode=0 Feb 17 19:38:49 crc kubenswrapper[4892]: I0217 19:38:49.168872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerDied","Data":"5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313"} Feb 17 19:38:50 crc kubenswrapper[4892]: I0217 19:38:50.190766 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerStarted","Data":"6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3"} Feb 17 19:38:50 crc kubenswrapper[4892]: I0217 19:38:50.214135 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zbh4l" podStartSLOduration=2.353813612 podStartE2EDuration="13.214095888s" podCreationTimestamp="2026-02-17 19:38:37 +0000 UTC" firstStartedPulling="2026-02-17 19:38:39.023744162 +0000 UTC m=+6890.399147427" lastFinishedPulling="2026-02-17 19:38:49.884026398 +0000 UTC m=+6901.259429703" observedRunningTime="2026-02-17 19:38:50.20861116 +0000 UTC m=+6901.584014425" watchObservedRunningTime="2026-02-17 19:38:50.214095888 +0000 UTC m=+6901.589499143" Feb 17 19:38:58 crc kubenswrapper[4892]: I0217 19:38:58.050677 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:58 crc kubenswrapper[4892]: I0217 19:38:58.051272 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:38:59 crc kubenswrapper[4892]: I0217 19:38:59.095867 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zbh4l" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="registry-server" probeResult="failure" output=< Feb 17 19:38:59 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:38:59 crc kubenswrapper[4892]: > Feb 17 19:39:06 crc kubenswrapper[4892]: I0217 19:39:06.055722 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-kbm4j"] Feb 17 19:39:06 crc kubenswrapper[4892]: I0217 19:39:06.068000 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-kbm4j"] Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.036172 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-b720-account-create-update-mng59"] Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.045022 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/aodh-b720-account-create-update-mng59"] Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.378582 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7" path="/var/lib/kubelet/pods/724bf084-29f5-4fdf-b1b4-bc3ab7cfe4a7/volumes" Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.379496 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df073dc-7ac0-41c7-899b-7373b161da67" path="/var/lib/kubelet/pods/7df073dc-7ac0-41c7-899b-7373b161da67/volumes" Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.424510 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.424617 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.758274 4892 scope.go:117] "RemoveContainer" containerID="4273ef726e9e8dfff3490bf9e13a16d9ef13a953fb1a85c34f1203a79fb46adf" Feb 17 19:39:07 crc kubenswrapper[4892]: I0217 19:39:07.802957 4892 scope.go:117] "RemoveContainer" containerID="9675d4aaac3723cfa4ebe733a0da53c4d14eb870405c06b52bc3532e86d51b6d" Feb 17 19:39:08 crc kubenswrapper[4892]: I0217 19:39:08.104436 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:39:08 crc kubenswrapper[4892]: I0217 19:39:08.162463 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:39:08 crc kubenswrapper[4892]: I0217 19:39:08.904762 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zbh4l"] Feb 17 19:39:09 crc kubenswrapper[4892]: I0217 19:39:09.405964 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zbh4l" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="registry-server" containerID="cri-o://6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3" gracePeriod=2 Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.006125 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.177885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-utilities\") pod \"86dbc6cd-3de6-414c-a422-53507111e8ca\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.177952 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-catalog-content\") pod \"86dbc6cd-3de6-414c-a422-53507111e8ca\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.178051 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qbv7\" (UniqueName: \"kubernetes.io/projected/86dbc6cd-3de6-414c-a422-53507111e8ca-kube-api-access-5qbv7\") pod \"86dbc6cd-3de6-414c-a422-53507111e8ca\" (UID: \"86dbc6cd-3de6-414c-a422-53507111e8ca\") " Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.178950 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-utilities" (OuterVolumeSpecName: "utilities") pod "86dbc6cd-3de6-414c-a422-53507111e8ca" (UID: "86dbc6cd-3de6-414c-a422-53507111e8ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.189106 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dbc6cd-3de6-414c-a422-53507111e8ca-kube-api-access-5qbv7" (OuterVolumeSpecName: "kube-api-access-5qbv7") pod "86dbc6cd-3de6-414c-a422-53507111e8ca" (UID: "86dbc6cd-3de6-414c-a422-53507111e8ca"). InnerVolumeSpecName "kube-api-access-5qbv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.280697 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.280742 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qbv7\" (UniqueName: \"kubernetes.io/projected/86dbc6cd-3de6-414c-a422-53507111e8ca-kube-api-access-5qbv7\") on node \"crc\" DevicePath \"\"" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.312672 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86dbc6cd-3de6-414c-a422-53507111e8ca" (UID: "86dbc6cd-3de6-414c-a422-53507111e8ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.383508 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dbc6cd-3de6-414c-a422-53507111e8ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.431433 4892 generic.go:334] "Generic (PLEG): container finished" podID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerID="6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3" exitCode=0 Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.431486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerDied","Data":"6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3"} Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.431514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbh4l" event={"ID":"86dbc6cd-3de6-414c-a422-53507111e8ca","Type":"ContainerDied","Data":"eaaf18da88985c21adfcd15e1471653f22bf5a05d5186b8dfb179b8e08850654"} Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.431534 4892 scope.go:117] "RemoveContainer" containerID="6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.431699 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zbh4l" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.467686 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zbh4l"] Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.468704 4892 scope.go:117] "RemoveContainer" containerID="5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.479565 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zbh4l"] Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.492359 4892 scope.go:117] "RemoveContainer" containerID="d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.539585 4892 scope.go:117] "RemoveContainer" containerID="6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3" Feb 17 19:39:10 crc kubenswrapper[4892]: E0217 19:39:10.540062 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3\": container with ID starting with 6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3 not found: ID does not exist" containerID="6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.540104 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3"} err="failed to get container status \"6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3\": rpc error: code = NotFound desc = could not find container \"6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3\": container with ID starting with 6e2731b0a4c1e86b4dfe8250d2463caab0cf2d39ae80436eb77eb39a2a461bd3 not found: ID does 
not exist" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.540125 4892 scope.go:117] "RemoveContainer" containerID="5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313" Feb 17 19:39:10 crc kubenswrapper[4892]: E0217 19:39:10.540720 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313\": container with ID starting with 5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313 not found: ID does not exist" containerID="5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.540744 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313"} err="failed to get container status \"5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313\": rpc error: code = NotFound desc = could not find container \"5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313\": container with ID starting with 5664fa71913afe84900d58e8df65d53a653ca53a5fa85258d1b47acd801c6313 not found: ID does not exist" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.540759 4892 scope.go:117] "RemoveContainer" containerID="d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1" Feb 17 19:39:10 crc kubenswrapper[4892]: E0217 19:39:10.540995 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1\": container with ID starting with d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1 not found: ID does not exist" containerID="d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1" Feb 17 19:39:10 crc kubenswrapper[4892]: I0217 19:39:10.541018 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1"} err="failed to get container status \"d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1\": rpc error: code = NotFound desc = could not find container \"d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1\": container with ID starting with d7d9a7c70ad343af3942ef95e5c8c071a051550e113b10d5a7fbc991df68bee1 not found: ID does not exist" Feb 17 19:39:11 crc kubenswrapper[4892]: I0217 19:39:11.374193 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" path="/var/lib/kubelet/pods/86dbc6cd-3de6-414c-a422-53507111e8ca/volumes" Feb 17 19:39:18 crc kubenswrapper[4892]: I0217 19:39:18.032533 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-pdh2g"] Feb 17 19:39:18 crc kubenswrapper[4892]: I0217 19:39:18.044454 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-pdh2g"] Feb 17 19:39:19 crc kubenswrapper[4892]: I0217 19:39:19.376901 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24578eef-81b5-44bd-acac-2352f1ce2892" path="/var/lib/kubelet/pods/24578eef-81b5-44bd-acac-2352f1ce2892/volumes" Feb 17 19:39:37 crc kubenswrapper[4892]: I0217 19:39:37.424966 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:39:37 crc kubenswrapper[4892]: I0217 19:39:37.425865 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:39:39 crc kubenswrapper[4892]: I0217 19:39:39.069845 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-zgcrt"] Feb 17 19:39:39 crc kubenswrapper[4892]: I0217 19:39:39.089487 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-79d7-account-create-update-t5x9n"] Feb 17 19:39:39 crc kubenswrapper[4892]: I0217 19:39:39.099496 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-zgcrt"] Feb 17 19:39:39 crc kubenswrapper[4892]: I0217 19:39:39.110051 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-79d7-account-create-update-t5x9n"] Feb 17 19:39:39 crc kubenswrapper[4892]: I0217 19:39:39.387678 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106208e4-2541-4e90-af0c-ca382c617cab" path="/var/lib/kubelet/pods/106208e4-2541-4e90-af0c-ca382c617cab/volumes" Feb 17 19:39:39 crc kubenswrapper[4892]: I0217 19:39:39.389282 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304b8888-e8d3-48c0-82f1-e655ad9edc79" path="/var/lib/kubelet/pods/304b8888-e8d3-48c0-82f1-e655ad9edc79/volumes" Feb 17 19:39:51 crc kubenswrapper[4892]: I0217 19:39:51.037968 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-lbj77"] Feb 17 19:39:51 crc kubenswrapper[4892]: I0217 19:39:51.051930 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-lbj77"] Feb 17 19:39:51 crc kubenswrapper[4892]: I0217 19:39:51.386017 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bf75de-8d27-4ca0-be50-bad2df22a6ea" path="/var/lib/kubelet/pods/e5bf75de-8d27-4ca0-be50-bad2df22a6ea/volumes" Feb 17 19:40:07 crc kubenswrapper[4892]: I0217 19:40:07.424543 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:40:07 crc kubenswrapper[4892]: I0217 19:40:07.425288 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:40:07 crc kubenswrapper[4892]: I0217 19:40:07.425393 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:40:07 crc kubenswrapper[4892]: I0217 19:40:07.426782 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2b2efd3d5988ce334d039949155348879bf6dbf519c1878bea365e6fb91554e"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:40:07 crc kubenswrapper[4892]: I0217 19:40:07.426854 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://d2b2efd3d5988ce334d039949155348879bf6dbf519c1878bea365e6fb91554e" gracePeriod=600 Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.005994 4892 scope.go:117] "RemoveContainer" containerID="32ec4a4642066bf06fc91b65afd41a4d40cc80aed602b2487652f4d43ee45eee" Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.057353 4892 scope.go:117] "RemoveContainer" containerID="832666fe0070771b40d542c667cd46a363db384ea90e60dbf62a678e5ef158bb" Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.106840 4892 
scope.go:117] "RemoveContainer" containerID="d526d6d95aeb98886aca2a95cf83b731410c8652885314337721634c4cf6ba94" Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.159268 4892 scope.go:117] "RemoveContainer" containerID="7916dcb4e7fef0f96a59fe7fd6988abc2ab60010c389af1751be47989a404e66" Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.241100 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="d2b2efd3d5988ce334d039949155348879bf6dbf519c1878bea365e6fb91554e" exitCode=0 Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.241190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"d2b2efd3d5988ce334d039949155348879bf6dbf519c1878bea365e6fb91554e"} Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.241243 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"} Feb 17 19:40:08 crc kubenswrapper[4892]: I0217 19:40:08.241297 4892 scope.go:117] "RemoveContainer" containerID="23227f03c5258bf78d47839d96cf8cec6b2327db6429577c69616ffcf333ff09" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.393429 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t2xc9"] Feb 17 19:41:34 crc kubenswrapper[4892]: E0217 19:41:34.395541 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="extract-utilities" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.395632 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="extract-utilities" Feb 17 19:41:34 crc 
kubenswrapper[4892]: E0217 19:41:34.395725 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="extract-content" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.395784 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="extract-content" Feb 17 19:41:34 crc kubenswrapper[4892]: E0217 19:41:34.395892 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="extract-content" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.395956 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="extract-content" Feb 17 19:41:34 crc kubenswrapper[4892]: E0217 19:41:34.396025 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="registry-server" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.396082 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="registry-server" Feb 17 19:41:34 crc kubenswrapper[4892]: E0217 19:41:34.396146 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="extract-utilities" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.396200 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="extract-utilities" Feb 17 19:41:34 crc kubenswrapper[4892]: E0217 19:41:34.396272 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="registry-server" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.396330 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="registry-server" Feb 17 19:41:34 crc 
kubenswrapper[4892]: I0217 19:41:34.396605 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d3b930-cea9-4bab-9963-2509288bfad1" containerName="registry-server" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.396686 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dbc6cd-3de6-414c-a422-53507111e8ca" containerName="registry-server" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.398768 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.405123 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t2xc9"] Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.449934 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-utilities\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.450137 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfmfx\" (UniqueName: \"kubernetes.io/projected/27d26a0a-7afb-40f6-a793-58c5b3319e4f-kube-api-access-gfmfx\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.450202 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-catalog-content\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" 
Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.552046 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfmfx\" (UniqueName: \"kubernetes.io/projected/27d26a0a-7afb-40f6-a793-58c5b3319e4f-kube-api-access-gfmfx\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.552167 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-catalog-content\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.552261 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-utilities\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.552909 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-utilities\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.553287 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-catalog-content\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9" Feb 17 19:41:34 crc kubenswrapper[4892]: 
I0217 19:41:34.575339 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfmfx\" (UniqueName: \"kubernetes.io/projected/27d26a0a-7afb-40f6-a793-58c5b3319e4f-kube-api-access-gfmfx\") pod \"community-operators-t2xc9\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") " pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:34 crc kubenswrapper[4892]: I0217 19:41:34.718867 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:35 crc kubenswrapper[4892]: I0217 19:41:35.257686 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t2xc9"]
Feb 17 19:41:35 crc kubenswrapper[4892]: I0217 19:41:35.357497 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerStarted","Data":"196088de09dc64650f90894e8b01aa6f8a7c4a26c96897fdc6d44f8895300bab"}
Feb 17 19:41:36 crc kubenswrapper[4892]: I0217 19:41:36.371155 4892 generic.go:334] "Generic (PLEG): container finished" podID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerID="237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858" exitCode=0
Feb 17 19:41:36 crc kubenswrapper[4892]: I0217 19:41:36.371439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerDied","Data":"237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858"}
Feb 17 19:41:37 crc kubenswrapper[4892]: I0217 19:41:37.384148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerStarted","Data":"9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78"}
Feb 17 19:41:41 crc kubenswrapper[4892]: I0217 19:41:41.430205 4892 generic.go:334] "Generic (PLEG): container finished" podID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerID="9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78" exitCode=0
Feb 17 19:41:41 crc kubenswrapper[4892]: I0217 19:41:41.430272 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerDied","Data":"9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78"}
Feb 17 19:41:42 crc kubenswrapper[4892]: I0217 19:41:42.442548 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerStarted","Data":"637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034"}
Feb 17 19:41:42 crc kubenswrapper[4892]: I0217 19:41:42.461007 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t2xc9" podStartSLOduration=3.001667173 podStartE2EDuration="8.460988399s" podCreationTimestamp="2026-02-17 19:41:34 +0000 UTC" firstStartedPulling="2026-02-17 19:41:36.375075093 +0000 UTC m=+7067.750478368" lastFinishedPulling="2026-02-17 19:41:41.834396309 +0000 UTC m=+7073.209799594" observedRunningTime="2026-02-17 19:41:42.458891002 +0000 UTC m=+7073.834294287" watchObservedRunningTime="2026-02-17 19:41:42.460988399 +0000 UTC m=+7073.836391664"
Feb 17 19:41:44 crc kubenswrapper[4892]: I0217 19:41:44.719255 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:44 crc kubenswrapper[4892]: I0217 19:41:44.719902 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:45 crc kubenswrapper[4892]: I0217 19:41:45.781408 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-t2xc9" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="registry-server" probeResult="failure" output=<
Feb 17 19:41:45 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Feb 17 19:41:45 crc kubenswrapper[4892]: >
Feb 17 19:41:54 crc kubenswrapper[4892]: I0217 19:41:54.813965 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:54 crc kubenswrapper[4892]: I0217 19:41:54.878471 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:55 crc kubenswrapper[4892]: I0217 19:41:55.060894 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t2xc9"]
Feb 17 19:41:56 crc kubenswrapper[4892]: I0217 19:41:56.612285 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t2xc9" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="registry-server" containerID="cri-o://637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034" gracePeriod=2
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.327876 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.339766 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfmfx\" (UniqueName: \"kubernetes.io/projected/27d26a0a-7afb-40f6-a793-58c5b3319e4f-kube-api-access-gfmfx\") pod \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") "
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.339920 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-catalog-content\") pod \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") "
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.340003 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-utilities\") pod \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\" (UID: \"27d26a0a-7afb-40f6-a793-58c5b3319e4f\") "
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.340725 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-utilities" (OuterVolumeSpecName: "utilities") pod "27d26a0a-7afb-40f6-a793-58c5b3319e4f" (UID: "27d26a0a-7afb-40f6-a793-58c5b3319e4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.347405 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d26a0a-7afb-40f6-a793-58c5b3319e4f-kube-api-access-gfmfx" (OuterVolumeSpecName: "kube-api-access-gfmfx") pod "27d26a0a-7afb-40f6-a793-58c5b3319e4f" (UID: "27d26a0a-7afb-40f6-a793-58c5b3319e4f"). InnerVolumeSpecName "kube-api-access-gfmfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.414598 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27d26a0a-7afb-40f6-a793-58c5b3319e4f" (UID: "27d26a0a-7afb-40f6-a793-58c5b3319e4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.442630 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.442659 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d26a0a-7afb-40f6-a793-58c5b3319e4f-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.442669 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfmfx\" (UniqueName: \"kubernetes.io/projected/27d26a0a-7afb-40f6-a793-58c5b3319e4f-kube-api-access-gfmfx\") on node \"crc\" DevicePath \"\""
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.626645 4892 generic.go:334] "Generic (PLEG): container finished" podID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerID="637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034" exitCode=0
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.626695 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerDied","Data":"637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034"}
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.626726 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2xc9" event={"ID":"27d26a0a-7afb-40f6-a793-58c5b3319e4f","Type":"ContainerDied","Data":"196088de09dc64650f90894e8b01aa6f8a7c4a26c96897fdc6d44f8895300bab"}
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.626747 4892 scope.go:117] "RemoveContainer" containerID="637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.626760 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2xc9"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.654406 4892 scope.go:117] "RemoveContainer" containerID="9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.682848 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t2xc9"]
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.693599 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t2xc9"]
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.700162 4892 scope.go:117] "RemoveContainer" containerID="237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.741000 4892 scope.go:117] "RemoveContainer" containerID="637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034"
Feb 17 19:41:57 crc kubenswrapper[4892]: E0217 19:41:57.741369 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034\": container with ID starting with 637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034 not found: ID does not exist" containerID="637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.741400 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034"} err="failed to get container status \"637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034\": rpc error: code = NotFound desc = could not find container \"637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034\": container with ID starting with 637efbaaa955f00ecdda66fd5d30411a8412e6d385707a5e3d701ba455b9e034 not found: ID does not exist"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.741421 4892 scope.go:117] "RemoveContainer" containerID="9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78"
Feb 17 19:41:57 crc kubenswrapper[4892]: E0217 19:41:57.741795 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78\": container with ID starting with 9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78 not found: ID does not exist" containerID="9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.741848 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78"} err="failed to get container status \"9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78\": rpc error: code = NotFound desc = could not find container \"9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78\": container with ID starting with 9619649f55c8b7c643c4ae24776c8c94bc5e5c373a57b32fd587467061c32e78 not found: ID does not exist"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.741875 4892 scope.go:117] "RemoveContainer" containerID="237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858"
Feb 17 19:41:57 crc kubenswrapper[4892]: E0217 19:41:57.742278 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858\": container with ID starting with 237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858 not found: ID does not exist" containerID="237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858"
Feb 17 19:41:57 crc kubenswrapper[4892]: I0217 19:41:57.742304 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858"} err="failed to get container status \"237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858\": rpc error: code = NotFound desc = could not find container \"237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858\": container with ID starting with 237cf994962392a8d6908bbe748ae5e5d6fa5bee63e40659f65b2ecaa3b88858 not found: ID does not exist"
Feb 17 19:41:59 crc kubenswrapper[4892]: I0217 19:41:59.384981 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" path="/var/lib/kubelet/pods/27d26a0a-7afb-40f6-a793-58c5b3319e4f/volumes"
Feb 17 19:42:07 crc kubenswrapper[4892]: I0217 19:42:07.425333 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 19:42:07 crc kubenswrapper[4892]: I0217 19:42:07.427383 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 19:42:37 crc kubenswrapper[4892]: I0217 19:42:37.426368 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 19:42:37 crc kubenswrapper[4892]: I0217 19:42:37.427048 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 19:43:01 crc kubenswrapper[4892]: I0217 19:43:01.413635 4892 generic.go:334] "Generic (PLEG): container finished" podID="b17be7f8-f4d0-434f-b0b0-010faf440574" containerID="f9461f639d8ca1093f31de86be7310fd6d5db58d14b83ea7b67ab80764274297" exitCode=0
Feb 17 19:43:01 crc kubenswrapper[4892]: I0217 19:43:01.414423 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" event={"ID":"b17be7f8-f4d0-434f-b0b0-010faf440574","Type":"ContainerDied","Data":"f9461f639d8ca1093f31de86be7310fd6d5db58d14b83ea7b67ab80764274297"}
Feb 17 19:43:02 crc kubenswrapper[4892]: I0217 19:43:02.963167 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd"
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.071620 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w86zr\" (UniqueName: \"kubernetes.io/projected/b17be7f8-f4d0-434f-b0b0-010faf440574-kube-api-access-w86zr\") pod \"b17be7f8-f4d0-434f-b0b0-010faf440574\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") "
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.072869 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-tripleo-cleanup-combined-ca-bundle\") pod \"b17be7f8-f4d0-434f-b0b0-010faf440574\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") "
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.073129 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-inventory\") pod \"b17be7f8-f4d0-434f-b0b0-010faf440574\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") "
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.073296 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ssh-key-openstack-cell1\") pod \"b17be7f8-f4d0-434f-b0b0-010faf440574\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") "
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.073373 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ceph\") pod \"b17be7f8-f4d0-434f-b0b0-010faf440574\" (UID: \"b17be7f8-f4d0-434f-b0b0-010faf440574\") "
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.078571 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "b17be7f8-f4d0-434f-b0b0-010faf440574" (UID: "b17be7f8-f4d0-434f-b0b0-010faf440574"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.084916 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ceph" (OuterVolumeSpecName: "ceph") pod "b17be7f8-f4d0-434f-b0b0-010faf440574" (UID: "b17be7f8-f4d0-434f-b0b0-010faf440574"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.086953 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17be7f8-f4d0-434f-b0b0-010faf440574-kube-api-access-w86zr" (OuterVolumeSpecName: "kube-api-access-w86zr") pod "b17be7f8-f4d0-434f-b0b0-010faf440574" (UID: "b17be7f8-f4d0-434f-b0b0-010faf440574"). InnerVolumeSpecName "kube-api-access-w86zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.104651 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-inventory" (OuterVolumeSpecName: "inventory") pod "b17be7f8-f4d0-434f-b0b0-010faf440574" (UID: "b17be7f8-f4d0-434f-b0b0-010faf440574"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.107803 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b17be7f8-f4d0-434f-b0b0-010faf440574" (UID: "b17be7f8-f4d0-434f-b0b0-010faf440574"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.177875 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.177938 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-ceph\") on node \"crc\" DevicePath \"\""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.177957 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w86zr\" (UniqueName: \"kubernetes.io/projected/b17be7f8-f4d0-434f-b0b0-010faf440574-kube-api-access-w86zr\") on node \"crc\" DevicePath \"\""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.177975 4892 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.178044 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b17be7f8-f4d0-434f-b0b0-010faf440574-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.442888 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd" event={"ID":"b17be7f8-f4d0-434f-b0b0-010faf440574","Type":"ContainerDied","Data":"9237e4fcfccfa95c650e640ee676b43de84251ea3f27860627228ea781a23eb1"}
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.442926 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9237e4fcfccfa95c650e640ee676b43de84251ea3f27860627228ea781a23eb1"
Feb 17 19:43:03 crc kubenswrapper[4892]: I0217 19:43:03.442980 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd"
Feb 17 19:43:07 crc kubenswrapper[4892]: I0217 19:43:07.424885 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 19:43:07 crc kubenswrapper[4892]: I0217 19:43:07.425596 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 19:43:07 crc kubenswrapper[4892]: I0217 19:43:07.425641 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt"
Feb 17 19:43:07 crc kubenswrapper[4892]: I0217 19:43:07.426548 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 19:43:07 crc kubenswrapper[4892]: I0217 19:43:07.426596 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" gracePeriod=600
Feb 17 19:43:07 crc kubenswrapper[4892]: E0217 19:43:07.550983 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.092221 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-v99wc"]
Feb 17 19:43:08 crc kubenswrapper[4892]: E0217 19:43:08.093127 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="extract-content"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.093147 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="extract-content"
Feb 17 19:43:08 crc kubenswrapper[4892]: E0217 19:43:08.093211 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="extract-utilities"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.093219 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="extract-utilities"
Feb 17 19:43:08 crc kubenswrapper[4892]: E0217 19:43:08.093245 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="registry-server"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.093254 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="registry-server"
Feb 17 19:43:08 crc kubenswrapper[4892]: E0217 19:43:08.093270 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17be7f8-f4d0-434f-b0b0-010faf440574" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.093282 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17be7f8-f4d0-434f-b0b0-010faf440574" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.093625 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17be7f8-f4d0-434f-b0b0-010faf440574" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.093656 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d26a0a-7afb-40f6-a793-58c5b3319e4f" containerName="registry-server"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.094809 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.098907 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.099252 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.099540 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.100606 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.109304 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-v99wc"]
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.186537 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.186606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ceph\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.186735 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5gj6\" (UniqueName: \"kubernetes.io/projected/69c7c84a-660b-4db3-a143-95efe4b92db2-kube-api-access-h5gj6\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.186920 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.186958 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-inventory\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.288861 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.288927 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-inventory\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.288986 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.289034 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ceph\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.289062 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5gj6\" (UniqueName: \"kubernetes.io/projected/69c7c84a-660b-4db3-a143-95efe4b92db2-kube-api-access-h5gj6\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.295498 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ceph\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.297797 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.300089 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.301228 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-inventory\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.309264 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5gj6\" (UniqueName: \"kubernetes.io/projected/69c7c84a-660b-4db3-a143-95efe4b92db2-kube-api-access-h5gj6\") pod \"bootstrap-openstack-openstack-cell1-v99wc\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.420397 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.500971 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" exitCode=0
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.501012 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"}
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.501044 4892 scope.go:117] "RemoveContainer" containerID="d2b2efd3d5988ce334d039949155348879bf6dbf519c1878bea365e6fb91554e"
Feb 17 19:43:08 crc kubenswrapper[4892]: I0217 19:43:08.501897 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:43:08 crc kubenswrapper[4892]: E0217 19:43:08.502339 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:43:09 crc kubenswrapper[4892]: I0217 19:43:09.118719 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-v99wc"]
Feb 17 19:43:09 crc kubenswrapper[4892]: I0217 19:43:09.121317 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 19:43:09 crc kubenswrapper[4892]: I0217 19:43:09.518686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc" event={"ID":"69c7c84a-660b-4db3-a143-95efe4b92db2","Type":"ContainerStarted","Data":"924a94c422ce35c065f9773a6ef6604c05396f567b6fd738056ddb2e4f930870"}
Feb 17 19:43:10 crc kubenswrapper[4892]: I0217 19:43:10.531343 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc" event={"ID":"69c7c84a-660b-4db3-a143-95efe4b92db2","Type":"ContainerStarted","Data":"58833eb5fbc9e1602f82651c2ac03a328c423f0c744018eae55a9a7746b650a2"}
Feb 17 19:43:10 crc kubenswrapper[4892]: I0217 19:43:10.561676 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc" podStartSLOduration=2.120467713 podStartE2EDuration="2.561651827s" podCreationTimestamp="2026-02-17 19:43:08 +0000 UTC" firstStartedPulling="2026-02-17 19:43:09.121070858 +0000 UTC m=+7160.496474133" lastFinishedPulling="2026-02-17 19:43:09.562254982 +0000 UTC m=+7160.937658247" observedRunningTime="2026-02-17 19:43:10.551246367 +0000 UTC m=+7161.926649652" watchObservedRunningTime="2026-02-17 19:43:10.561651827 +0000 UTC m=+7161.937055092"
Feb 17 19:43:23 crc kubenswrapper[4892]: I0217 19:43:23.360212 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:43:23 crc kubenswrapper[4892]: E0217 19:43:23.360964 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:43:38 crc kubenswrapper[4892]: I0217 19:43:38.360052 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:43:38 crc kubenswrapper[4892]: E0217 19:43:38.360829 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:43:51 crc kubenswrapper[4892]: I0217 19:43:51.360664 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:43:51 crc kubenswrapper[4892]: E0217 19:43:51.361935 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:44:03 crc kubenswrapper[4892]: I0217 19:44:03.368335 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:44:03 crc kubenswrapper[4892]: E0217 19:44:03.369303 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:44:16 crc kubenswrapper[4892]: I0217 19:44:16.359462 4892 scope.go:117]
"RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:44:16 crc kubenswrapper[4892]: E0217 19:44:16.360383 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:44:29 crc kubenswrapper[4892]: I0217 19:44:29.376949 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:44:29 crc kubenswrapper[4892]: E0217 19:44:29.378135 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:44:42 crc kubenswrapper[4892]: I0217 19:44:42.361175 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:44:42 crc kubenswrapper[4892]: E0217 19:44:42.362351 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:44:54 crc kubenswrapper[4892]: I0217 19:44:54.359367 
4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:44:54 crc kubenswrapper[4892]: E0217 19:44:54.361037 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.162747 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs"] Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.165284 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.168897 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.169322 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.181614 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cwp\" (UniqueName: \"kubernetes.io/projected/dc6e067c-cdf0-490c-a561-43a216cde39c-kube-api-access-p2cwp\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.182377 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc6e067c-cdf0-490c-a561-43a216cde39c-config-volume\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.182565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc6e067c-cdf0-490c-a561-43a216cde39c-secret-volume\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.207185 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs"] Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.284715 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cwp\" (UniqueName: \"kubernetes.io/projected/dc6e067c-cdf0-490c-a561-43a216cde39c-kube-api-access-p2cwp\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.284888 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc6e067c-cdf0-490c-a561-43a216cde39c-config-volume\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.284931 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/dc6e067c-cdf0-490c-a561-43a216cde39c-secret-volume\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.286512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc6e067c-cdf0-490c-a561-43a216cde39c-config-volume\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.296900 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc6e067c-cdf0-490c-a561-43a216cde39c-secret-volume\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.301516 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2cwp\" (UniqueName: \"kubernetes.io/projected/dc6e067c-cdf0-490c-a561-43a216cde39c-kube-api-access-p2cwp\") pod \"collect-profiles-29522625-4h5xs\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:00 crc kubenswrapper[4892]: I0217 19:45:00.497048 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:01 crc kubenswrapper[4892]: I0217 19:45:01.060998 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs"] Feb 17 19:45:01 crc kubenswrapper[4892]: I0217 19:45:01.109248 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" event={"ID":"dc6e067c-cdf0-490c-a561-43a216cde39c","Type":"ContainerStarted","Data":"3c8f0d051c34f5ebb0219ef1e0ef73dc1dfcd31c506e286148f8b5fb4bd5d573"} Feb 17 19:45:02 crc kubenswrapper[4892]: I0217 19:45:02.123637 4892 generic.go:334] "Generic (PLEG): container finished" podID="dc6e067c-cdf0-490c-a561-43a216cde39c" containerID="76458a9fb6d6c16b17614531b215790665eaa9219e3ed7b1e86569c79ad49f77" exitCode=0 Feb 17 19:45:02 crc kubenswrapper[4892]: I0217 19:45:02.123692 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" event={"ID":"dc6e067c-cdf0-490c-a561-43a216cde39c","Type":"ContainerDied","Data":"76458a9fb6d6c16b17614531b215790665eaa9219e3ed7b1e86569c79ad49f77"} Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.542703 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.669663 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc6e067c-cdf0-490c-a561-43a216cde39c-config-volume\") pod \"dc6e067c-cdf0-490c-a561-43a216cde39c\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.669764 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc6e067c-cdf0-490c-a561-43a216cde39c-secret-volume\") pod \"dc6e067c-cdf0-490c-a561-43a216cde39c\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.671589 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2cwp\" (UniqueName: \"kubernetes.io/projected/dc6e067c-cdf0-490c-a561-43a216cde39c-kube-api-access-p2cwp\") pod \"dc6e067c-cdf0-490c-a561-43a216cde39c\" (UID: \"dc6e067c-cdf0-490c-a561-43a216cde39c\") " Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.672437 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6e067c-cdf0-490c-a561-43a216cde39c-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc6e067c-cdf0-490c-a561-43a216cde39c" (UID: "dc6e067c-cdf0-490c-a561-43a216cde39c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.677503 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc6e067c-cdf0-490c-a561-43a216cde39c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.690751 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6e067c-cdf0-490c-a561-43a216cde39c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc6e067c-cdf0-490c-a561-43a216cde39c" (UID: "dc6e067c-cdf0-490c-a561-43a216cde39c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.694234 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6e067c-cdf0-490c-a561-43a216cde39c-kube-api-access-p2cwp" (OuterVolumeSpecName: "kube-api-access-p2cwp") pod "dc6e067c-cdf0-490c-a561-43a216cde39c" (UID: "dc6e067c-cdf0-490c-a561-43a216cde39c"). InnerVolumeSpecName "kube-api-access-p2cwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.781685 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc6e067c-cdf0-490c-a561-43a216cde39c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 19:45:03 crc kubenswrapper[4892]: I0217 19:45:03.781729 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2cwp\" (UniqueName: \"kubernetes.io/projected/dc6e067c-cdf0-490c-a561-43a216cde39c-kube-api-access-p2cwp\") on node \"crc\" DevicePath \"\"" Feb 17 19:45:04 crc kubenswrapper[4892]: I0217 19:45:04.148464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" event={"ID":"dc6e067c-cdf0-490c-a561-43a216cde39c","Type":"ContainerDied","Data":"3c8f0d051c34f5ebb0219ef1e0ef73dc1dfcd31c506e286148f8b5fb4bd5d573"} Feb 17 19:45:04 crc kubenswrapper[4892]: I0217 19:45:04.148991 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8f0d051c34f5ebb0219ef1e0ef73dc1dfcd31c506e286148f8b5fb4bd5d573" Feb 17 19:45:04 crc kubenswrapper[4892]: I0217 19:45:04.148548 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs" Feb 17 19:45:04 crc kubenswrapper[4892]: I0217 19:45:04.619362 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"] Feb 17 19:45:04 crc kubenswrapper[4892]: I0217 19:45:04.629875 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522580-v4ck9"] Feb 17 19:45:05 crc kubenswrapper[4892]: I0217 19:45:05.376764 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ddccfb8-a091-46b4-962a-c176143d0c7a" path="/var/lib/kubelet/pods/7ddccfb8-a091-46b4-962a-c176143d0c7a/volumes" Feb 17 19:45:08 crc kubenswrapper[4892]: I0217 19:45:08.480994 4892 scope.go:117] "RemoveContainer" containerID="7f0aa7858d6abd58c096f23ce9a351f74240c3af18a152f7ef0abd7b8976e91f" Feb 17 19:45:09 crc kubenswrapper[4892]: I0217 19:45:09.370610 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:45:09 crc kubenswrapper[4892]: E0217 19:45:09.371415 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:45:23 crc kubenswrapper[4892]: I0217 19:45:23.359467 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:45:23 crc kubenswrapper[4892]: E0217 19:45:23.360217 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:45:34 crc kubenswrapper[4892]: I0217 19:45:34.359898 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:45:34 crc kubenswrapper[4892]: E0217 19:45:34.360788 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:45:49 crc kubenswrapper[4892]: I0217 19:45:49.375402 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:45:49 crc kubenswrapper[4892]: E0217 19:45:49.376824 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:46:00 crc kubenswrapper[4892]: I0217 19:46:00.359523 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:46:00 crc kubenswrapper[4892]: E0217 19:46:00.360391 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:46:11 crc kubenswrapper[4892]: I0217 19:46:11.361120 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:46:11 crc kubenswrapper[4892]: E0217 19:46:11.362364 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:46:17 crc kubenswrapper[4892]: I0217 19:46:17.030101 4892 generic.go:334] "Generic (PLEG): container finished" podID="69c7c84a-660b-4db3-a143-95efe4b92db2" containerID="58833eb5fbc9e1602f82651c2ac03a328c423f0c744018eae55a9a7746b650a2" exitCode=0 Feb 17 19:46:17 crc kubenswrapper[4892]: I0217 19:46:17.030142 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc" event={"ID":"69c7c84a-660b-4db3-a143-95efe4b92db2","Type":"ContainerDied","Data":"58833eb5fbc9e1602f82651c2ac03a328c423f0c744018eae55a9a7746b650a2"} Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.612040 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.776681 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ceph\") pod \"69c7c84a-660b-4db3-a143-95efe4b92db2\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.776801 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5gj6\" (UniqueName: \"kubernetes.io/projected/69c7c84a-660b-4db3-a143-95efe4b92db2-kube-api-access-h5gj6\") pod \"69c7c84a-660b-4db3-a143-95efe4b92db2\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.776895 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ssh-key-openstack-cell1\") pod \"69c7c84a-660b-4db3-a143-95efe4b92db2\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.777119 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-bootstrap-combined-ca-bundle\") pod \"69c7c84a-660b-4db3-a143-95efe4b92db2\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.777161 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-inventory\") pod \"69c7c84a-660b-4db3-a143-95efe4b92db2\" (UID: \"69c7c84a-660b-4db3-a143-95efe4b92db2\") " Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.782529 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "69c7c84a-660b-4db3-a143-95efe4b92db2" (UID: "69c7c84a-660b-4db3-a143-95efe4b92db2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.783430 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ceph" (OuterVolumeSpecName: "ceph") pod "69c7c84a-660b-4db3-a143-95efe4b92db2" (UID: "69c7c84a-660b-4db3-a143-95efe4b92db2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.786170 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c7c84a-660b-4db3-a143-95efe4b92db2-kube-api-access-h5gj6" (OuterVolumeSpecName: "kube-api-access-h5gj6") pod "69c7c84a-660b-4db3-a143-95efe4b92db2" (UID: "69c7c84a-660b-4db3-a143-95efe4b92db2"). InnerVolumeSpecName "kube-api-access-h5gj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.810922 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-inventory" (OuterVolumeSpecName: "inventory") pod "69c7c84a-660b-4db3-a143-95efe4b92db2" (UID: "69c7c84a-660b-4db3-a143-95efe4b92db2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.813508 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "69c7c84a-660b-4db3-a143-95efe4b92db2" (UID: "69c7c84a-660b-4db3-a143-95efe4b92db2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.879337 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.879375 4892 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.879385 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.879394 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69c7c84a-660b-4db3-a143-95efe4b92db2-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:46:18 crc kubenswrapper[4892]: I0217 19:46:18.879403 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5gj6\" (UniqueName: \"kubernetes.io/projected/69c7c84a-660b-4db3-a143-95efe4b92db2-kube-api-access-h5gj6\") on node \"crc\" DevicePath \"\"" Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.052919 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-v99wc" event={"ID":"69c7c84a-660b-4db3-a143-95efe4b92db2","Type":"ContainerDied","Data":"924a94c422ce35c065f9773a6ef6604c05396f567b6fd738056ddb2e4f930870"}
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.052977 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924a94c422ce35c065f9773a6ef6604c05396f567b6fd738056ddb2e4f930870"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.052994 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-v99wc"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.154738 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-hmvph"]
Feb 17 19:46:19 crc kubenswrapper[4892]: E0217 19:46:19.155445 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6e067c-cdf0-490c-a561-43a216cde39c" containerName="collect-profiles"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.155549 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6e067c-cdf0-490c-a561-43a216cde39c" containerName="collect-profiles"
Feb 17 19:46:19 crc kubenswrapper[4892]: E0217 19:46:19.155641 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c7c84a-660b-4db3-a143-95efe4b92db2" containerName="bootstrap-openstack-openstack-cell1"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.155702 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c7c84a-660b-4db3-a143-95efe4b92db2" containerName="bootstrap-openstack-openstack-cell1"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.155994 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c7c84a-660b-4db3-a143-95efe4b92db2" containerName="bootstrap-openstack-openstack-cell1"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.156075 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc6e067c-cdf0-490c-a561-43a216cde39c" containerName="collect-profiles"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.157014 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.159687 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.159886 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.160074 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.161081 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.168689 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-hmvph"]
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.186121 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-inventory\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.186455 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf679\" (UniqueName: \"kubernetes.io/projected/0b80ae5d-715f-4c7b-a06b-597a0a53a869-kube-api-access-lf679\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.186629 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ceph\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.186850 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.289587 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-inventory\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.290298 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf679\" (UniqueName: \"kubernetes.io/projected/0b80ae5d-715f-4c7b-a06b-597a0a53a869-kube-api-access-lf679\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.290455 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ceph\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.290646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.295867 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-inventory\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.295937 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.304779 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ceph\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.308950 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf679\" (UniqueName: \"kubernetes.io/projected/0b80ae5d-715f-4c7b-a06b-597a0a53a869-kube-api-access-lf679\") pod \"download-cache-openstack-openstack-cell1-hmvph\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") " pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:19 crc kubenswrapper[4892]: I0217 19:46:19.508396 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:46:20 crc kubenswrapper[4892]: I0217 19:46:20.113808 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-hmvph"]
Feb 17 19:46:21 crc kubenswrapper[4892]: I0217 19:46:21.078352 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-hmvph" event={"ID":"0b80ae5d-715f-4c7b-a06b-597a0a53a869","Type":"ContainerStarted","Data":"a2b6e9efe8740f9580c2b779da4b0ea0197837efe1f931ee21f17d121ba95a77"}
Feb 17 19:46:21 crc kubenswrapper[4892]: I0217 19:46:21.078889 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-hmvph" event={"ID":"0b80ae5d-715f-4c7b-a06b-597a0a53a869","Type":"ContainerStarted","Data":"6f665fadfda11cd424361ef548b5170836ed287e0d5530c18d56a9fb2f7f044c"}
Feb 17 19:46:21 crc kubenswrapper[4892]: I0217 19:46:21.098153 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-hmvph" podStartSLOduration=1.6444273900000002 podStartE2EDuration="2.098135151s" podCreationTimestamp="2026-02-17 19:46:19 +0000 UTC" firstStartedPulling="2026-02-17 19:46:20.118392725 +0000 UTC m=+7351.493796010" lastFinishedPulling="2026-02-17 19:46:20.572100466 +0000 UTC m=+7351.947503771" observedRunningTime="2026-02-17 19:46:21.092695564 +0000 UTC m=+7352.468098839" watchObservedRunningTime="2026-02-17 19:46:21.098135151 +0000 UTC m=+7352.473538416"
Feb 17 19:46:24 crc kubenswrapper[4892]: I0217 19:46:24.359644 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:46:24 crc kubenswrapper[4892]: E0217 19:46:24.360478 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:46:39 crc kubenswrapper[4892]: I0217 19:46:39.370377 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:46:39 crc kubenswrapper[4892]: E0217 19:46:39.372188 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:46:54 crc kubenswrapper[4892]: I0217 19:46:54.359914 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:46:54 crc kubenswrapper[4892]: E0217 19:46:54.360753 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:47:07 crc kubenswrapper[4892]: I0217 19:47:07.360447 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:47:07 crc kubenswrapper[4892]: E0217 19:47:07.362055 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:47:20 crc kubenswrapper[4892]: I0217 19:47:20.359374 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:47:20 crc kubenswrapper[4892]: E0217 19:47:20.360638 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:47:35 crc kubenswrapper[4892]: I0217 19:47:35.360854 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:47:35 crc kubenswrapper[4892]: E0217 19:47:35.361973 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:47:47 crc kubenswrapper[4892]: I0217 19:47:47.360496 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:47:47 crc kubenswrapper[4892]: E0217 19:47:47.361608 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:47:56 crc kubenswrapper[4892]: I0217 19:47:56.310928 4892 generic.go:334] "Generic (PLEG): container finished" podID="0b80ae5d-715f-4c7b-a06b-597a0a53a869" containerID="a2b6e9efe8740f9580c2b779da4b0ea0197837efe1f931ee21f17d121ba95a77" exitCode=0
Feb 17 19:47:56 crc kubenswrapper[4892]: I0217 19:47:56.311024 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-hmvph" event={"ID":"0b80ae5d-715f-4c7b-a06b-597a0a53a869","Type":"ContainerDied","Data":"a2b6e9efe8740f9580c2b779da4b0ea0197837efe1f931ee21f17d121ba95a77"}
Feb 17 19:47:57 crc kubenswrapper[4892]: I0217 19:47:57.963632 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.013027 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf679\" (UniqueName: \"kubernetes.io/projected/0b80ae5d-715f-4c7b-a06b-597a0a53a869-kube-api-access-lf679\") pod \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") "
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.013161 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-inventory\") pod \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") "
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.013351 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ceph\") pod \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") "
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.013383 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ssh-key-openstack-cell1\") pod \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\" (UID: \"0b80ae5d-715f-4c7b-a06b-597a0a53a869\") "
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.034669 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b80ae5d-715f-4c7b-a06b-597a0a53a869-kube-api-access-lf679" (OuterVolumeSpecName: "kube-api-access-lf679") pod "0b80ae5d-715f-4c7b-a06b-597a0a53a869" (UID: "0b80ae5d-715f-4c7b-a06b-597a0a53a869"). InnerVolumeSpecName "kube-api-access-lf679". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.039963 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ceph" (OuterVolumeSpecName: "ceph") pod "0b80ae5d-715f-4c7b-a06b-597a0a53a869" (UID: "0b80ae5d-715f-4c7b-a06b-597a0a53a869"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.053012 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0b80ae5d-715f-4c7b-a06b-597a0a53a869" (UID: "0b80ae5d-715f-4c7b-a06b-597a0a53a869"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.072129 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-inventory" (OuterVolumeSpecName: "inventory") pod "0b80ae5d-715f-4c7b-a06b-597a0a53a869" (UID: "0b80ae5d-715f-4c7b-a06b-597a0a53a869"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.117064 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf679\" (UniqueName: \"kubernetes.io/projected/0b80ae5d-715f-4c7b-a06b-597a0a53a869-kube-api-access-lf679\") on node \"crc\" DevicePath \"\""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.117101 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.117114 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ceph\") on node \"crc\" DevicePath \"\""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.117125 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b80ae5d-715f-4c7b-a06b-597a0a53a869-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.340913 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-hmvph" event={"ID":"0b80ae5d-715f-4c7b-a06b-597a0a53a869","Type":"ContainerDied","Data":"6f665fadfda11cd424361ef548b5170836ed287e0d5530c18d56a9fb2f7f044c"}
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.341173 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f665fadfda11cd424361ef548b5170836ed287e0d5530c18d56a9fb2f7f044c"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.340997 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-hmvph"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.359767 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:47:58 crc kubenswrapper[4892]: E0217 19:47:58.360494 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.587938 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lvk64"]
Feb 17 19:47:58 crc kubenswrapper[4892]: E0217 19:47:58.588697 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b80ae5d-715f-4c7b-a06b-597a0a53a869" containerName="download-cache-openstack-openstack-cell1"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.588723 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b80ae5d-715f-4c7b-a06b-597a0a53a869" containerName="download-cache-openstack-openstack-cell1"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.589125 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b80ae5d-715f-4c7b-a06b-597a0a53a869" containerName="download-cache-openstack-openstack-cell1"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.590340 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.592398 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.592677 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.593240 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.593596 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.601738 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lvk64"]
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.630179 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.630266 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-inventory\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.630327 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzsqg\" (UniqueName: \"kubernetes.io/projected/9d911062-0c28-415c-bdb4-6288348c105a-kube-api-access-zzsqg\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.630429 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ceph\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.731947 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ceph\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.732105 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.732169 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-inventory\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.732224 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzsqg\" (UniqueName: \"kubernetes.io/projected/9d911062-0c28-415c-bdb4-6288348c105a-kube-api-access-zzsqg\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.736565 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ceph\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.737315 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-inventory\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.737521 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.749213 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzsqg\" (UniqueName: \"kubernetes.io/projected/9d911062-0c28-415c-bdb4-6288348c105a-kube-api-access-zzsqg\") pod \"configure-network-openstack-openstack-cell1-lvk64\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:58 crc kubenswrapper[4892]: I0217 19:47:58.919620 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lvk64"
Feb 17 19:47:59 crc kubenswrapper[4892]: I0217 19:47:59.705305 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lvk64"]
Feb 17 19:48:00 crc kubenswrapper[4892]: I0217 19:48:00.363836 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" event={"ID":"9d911062-0c28-415c-bdb4-6288348c105a","Type":"ContainerStarted","Data":"9639c3e112bad002d2b20cdd9857e97ec168a756668710d4584999c5e9af5e9c"}
Feb 17 19:48:01 crc kubenswrapper[4892]: I0217 19:48:01.386907 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" event={"ID":"9d911062-0c28-415c-bdb4-6288348c105a","Type":"ContainerStarted","Data":"00b9910611f0d688b58b552a120b233b90389f22879d0906eea69800c05c42c8"}
Feb 17 19:48:01 crc kubenswrapper[4892]: I0217 19:48:01.434975 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" podStartSLOduration=3.009400923 podStartE2EDuration="3.434949144s" podCreationTimestamp="2026-02-17 19:47:58 +0000 UTC" firstStartedPulling="2026-02-17 19:47:59.706040739 +0000 UTC m=+7451.081444004" lastFinishedPulling="2026-02-17 19:48:00.13158897 +0000 UTC m=+7451.506992225" observedRunningTime="2026-02-17 19:48:01.41515193 +0000 UTC m=+7452.790555205" watchObservedRunningTime="2026-02-17 19:48:01.434949144 +0000 UTC m=+7452.810352449"
Feb 17 19:48:12 crc kubenswrapper[4892]: I0217 19:48:12.359931 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48"
Feb 17 19:48:13 crc kubenswrapper[4892]: I0217 19:48:13.536226 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a"}
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.357006 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2r9g5"]
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.361322 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.380507 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2r9g5"]
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.486584 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtf5\" (UniqueName: \"kubernetes.io/projected/51a13936-fe59-44ff-993b-cb343fb5e88c-kube-api-access-7mtf5\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.486883 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-catalog-content\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.486964 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-utilities\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.589918 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtf5\" (UniqueName: \"kubernetes.io/projected/51a13936-fe59-44ff-993b-cb343fb5e88c-kube-api-access-7mtf5\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.590067 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-catalog-content\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.590102 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-utilities\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.590645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-utilities\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.591306 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-catalog-content\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.612721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtf5\" (UniqueName: \"kubernetes.io/projected/51a13936-fe59-44ff-993b-cb343fb5e88c-kube-api-access-7mtf5\") pod \"certified-operators-2r9g5\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:31 crc kubenswrapper[4892]: I0217 19:48:31.695899 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:32 crc kubenswrapper[4892]: W0217 19:48:32.252346 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a13936_fe59_44ff_993b_cb343fb5e88c.slice/crio-3b568ac6814a4cc71f71ea874ac894f2c6271d7677a2128b2a6d8f4bafc8a520 WatchSource:0}: Error finding container 3b568ac6814a4cc71f71ea874ac894f2c6271d7677a2128b2a6d8f4bafc8a520: Status 404 returned error can't find the container with id 3b568ac6814a4cc71f71ea874ac894f2c6271d7677a2128b2a6d8f4bafc8a520
Feb 17 19:48:32 crc kubenswrapper[4892]: I0217 19:48:32.268776 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2r9g5"]
Feb 17 19:48:32 crc kubenswrapper[4892]: I0217 19:48:32.775870 4892 generic.go:334] "Generic (PLEG): container finished" podID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerID="0dadd2c7a4ab9a2cb068cf4fdc67f117a85ec2c007c2766124ad0082b9b3fc38" exitCode=0
Feb 17 19:48:32 crc kubenswrapper[4892]: I0217 19:48:32.776195 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerDied","Data":"0dadd2c7a4ab9a2cb068cf4fdc67f117a85ec2c007c2766124ad0082b9b3fc38"}
Feb 17 19:48:32 crc kubenswrapper[4892]: I0217 19:48:32.776234 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerStarted","Data":"3b568ac6814a4cc71f71ea874ac894f2c6271d7677a2128b2a6d8f4bafc8a520"}
Feb 17 19:48:32 crc kubenswrapper[4892]: I0217 19:48:32.779030 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 19:48:33 crc kubenswrapper[4892]: I0217 19:48:33.791973 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerStarted","Data":"c63c1375c6726ab96dd50d93cad13946ea4617c6bdf08087c48ab1d708e4fbbc"}
Feb 17 19:48:35 crc kubenswrapper[4892]: I0217 19:48:35.821794 4892 generic.go:334] "Generic (PLEG): container finished" podID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerID="c63c1375c6726ab96dd50d93cad13946ea4617c6bdf08087c48ab1d708e4fbbc" exitCode=0
Feb 17 19:48:35 crc kubenswrapper[4892]: I0217 19:48:35.821910 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerDied","Data":"c63c1375c6726ab96dd50d93cad13946ea4617c6bdf08087c48ab1d708e4fbbc"}
Feb 17 19:48:36 crc kubenswrapper[4892]: I0217 19:48:36.843598 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerStarted","Data":"6ac4435a58251af7866637dfecf5ec1d2378861e7fb64a0ea26ea23daefc301a"}
Feb 17 19:48:36 crc kubenswrapper[4892]: I0217 19:48:36.876467 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2r9g5" podStartSLOduration=2.103132266 podStartE2EDuration="5.876446455s" podCreationTimestamp="2026-02-17 19:48:31 +0000 UTC" firstStartedPulling="2026-02-17 19:48:32.778610123 +0000 UTC m=+7484.154013428" lastFinishedPulling="2026-02-17 19:48:36.551924312 +0000 UTC m=+7487.927327617" observedRunningTime="2026-02-17 19:48:36.864837032 +0000 UTC m=+7488.240240297" watchObservedRunningTime="2026-02-17 19:48:36.876446455 +0000 UTC m=+7488.251849720"
Feb 17 19:48:41 crc kubenswrapper[4892]: I0217 19:48:41.697218 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:41 crc kubenswrapper[4892]: I0217 19:48:41.698068 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:41 crc kubenswrapper[4892]: I0217 19:48:41.763587 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:41 crc kubenswrapper[4892]: I0217 19:48:41.976289 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2r9g5"
Feb 17 19:48:42 crc kubenswrapper[4892]: I0217 19:48:42.035179 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2r9g5"]
Feb 17 19:48:43 crc kubenswrapper[4892]: I0217 19:48:43.936046 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2r9g5" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="registry-server" containerID="cri-o://6ac4435a58251af7866637dfecf5ec1d2378861e7fb64a0ea26ea23daefc301a" gracePeriod=2
Feb 17 19:48:44 crc kubenswrapper[4892]: I0217 19:48:44.949707 4892 generic.go:334] "Generic (PLEG): container finished" podID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerID="6ac4435a58251af7866637dfecf5ec1d2378861e7fb64a0ea26ea23daefc301a" exitCode=0
Feb 17 19:48:44 crc kubenswrapper[4892]: I0217 19:48:44.949783 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerDied","Data":"6ac4435a58251af7866637dfecf5ec1d2378861e7fb64a0ea26ea23daefc301a"}
Feb 17 19:48:44 crc kubenswrapper[4892]: I0217 19:48:44.950282 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r9g5" event={"ID":"51a13936-fe59-44ff-993b-cb343fb5e88c","Type":"ContainerDied","Data":"3b568ac6814a4cc71f71ea874ac894f2c6271d7677a2128b2a6d8f4bafc8a520"}
Feb 17 19:48:44 crc kubenswrapper[4892]: I0217 19:48:44.950297 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b568ac6814a4cc71f71ea874ac894f2c6271d7677a2128b2a6d8f4bafc8a520"
Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.052016 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2r9g5" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.112598 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-utilities\") pod \"51a13936-fe59-44ff-993b-cb343fb5e88c\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.112683 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-catalog-content\") pod \"51a13936-fe59-44ff-993b-cb343fb5e88c\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.112739 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtf5\" (UniqueName: \"kubernetes.io/projected/51a13936-fe59-44ff-993b-cb343fb5e88c-kube-api-access-7mtf5\") pod \"51a13936-fe59-44ff-993b-cb343fb5e88c\" (UID: \"51a13936-fe59-44ff-993b-cb343fb5e88c\") " Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.114586 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-utilities" (OuterVolumeSpecName: "utilities") pod "51a13936-fe59-44ff-993b-cb343fb5e88c" (UID: "51a13936-fe59-44ff-993b-cb343fb5e88c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.120154 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a13936-fe59-44ff-993b-cb343fb5e88c-kube-api-access-7mtf5" (OuterVolumeSpecName: "kube-api-access-7mtf5") pod "51a13936-fe59-44ff-993b-cb343fb5e88c" (UID: "51a13936-fe59-44ff-993b-cb343fb5e88c"). InnerVolumeSpecName "kube-api-access-7mtf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.186453 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51a13936-fe59-44ff-993b-cb343fb5e88c" (UID: "51a13936-fe59-44ff-993b-cb343fb5e88c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.215584 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.215619 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a13936-fe59-44ff-993b-cb343fb5e88c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.215632 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mtf5\" (UniqueName: \"kubernetes.io/projected/51a13936-fe59-44ff-993b-cb343fb5e88c-kube-api-access-7mtf5\") on node \"crc\" DevicePath \"\"" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.961970 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2r9g5" Feb 17 19:48:45 crc kubenswrapper[4892]: I0217 19:48:45.991039 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2r9g5"] Feb 17 19:48:46 crc kubenswrapper[4892]: I0217 19:48:46.001911 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2r9g5"] Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.388791 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" path="/var/lib/kubelet/pods/51a13936-fe59-44ff-993b-cb343fb5e88c/volumes" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.433604 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tkmsz"] Feb 17 19:48:47 crc kubenswrapper[4892]: E0217 19:48:47.434302 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="registry-server" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.434319 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="registry-server" Feb 17 19:48:47 crc kubenswrapper[4892]: E0217 19:48:47.434352 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="extract-content" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.434361 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="extract-content" Feb 17 19:48:47 crc kubenswrapper[4892]: E0217 19:48:47.434384 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="extract-utilities" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.434394 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="extract-utilities" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.434718 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a13936-fe59-44ff-993b-cb343fb5e88c" containerName="registry-server" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.438169 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.464491 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tkmsz"] Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.475117 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-catalog-content\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.475210 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-utilities\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.475571 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgfkf\" (UniqueName: \"kubernetes.io/projected/359bffdc-f01a-413c-959f-52bf4bac680f-kube-api-access-vgfkf\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.578232 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-catalog-content\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.578337 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-utilities\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.578423 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgfkf\" (UniqueName: \"kubernetes.io/projected/359bffdc-f01a-413c-959f-52bf4bac680f-kube-api-access-vgfkf\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.579181 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-catalog-content\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.579385 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-utilities\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.603011 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vgfkf\" (UniqueName: \"kubernetes.io/projected/359bffdc-f01a-413c-959f-52bf4bac680f-kube-api-access-vgfkf\") pod \"redhat-marketplace-tkmsz\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:47 crc kubenswrapper[4892]: I0217 19:48:47.782420 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:48 crc kubenswrapper[4892]: I0217 19:48:48.339515 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tkmsz"] Feb 17 19:48:49 crc kubenswrapper[4892]: I0217 19:48:49.016161 4892 generic.go:334] "Generic (PLEG): container finished" podID="359bffdc-f01a-413c-959f-52bf4bac680f" containerID="d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b" exitCode=0 Feb 17 19:48:49 crc kubenswrapper[4892]: I0217 19:48:49.016359 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerDied","Data":"d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b"} Feb 17 19:48:49 crc kubenswrapper[4892]: I0217 19:48:49.016505 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerStarted","Data":"8dfd42ab5d504cf7680b02766fddc984ca6bcd85d62700d6ac5b183dce01d0a3"} Feb 17 19:48:51 crc kubenswrapper[4892]: I0217 19:48:51.053763 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerStarted","Data":"16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925"} Feb 17 19:48:53 crc kubenswrapper[4892]: I0217 19:48:53.084226 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="359bffdc-f01a-413c-959f-52bf4bac680f" containerID="16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925" exitCode=0 Feb 17 19:48:53 crc kubenswrapper[4892]: I0217 19:48:53.084289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerDied","Data":"16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925"} Feb 17 19:48:54 crc kubenswrapper[4892]: I0217 19:48:54.100878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerStarted","Data":"8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156"} Feb 17 19:48:54 crc kubenswrapper[4892]: I0217 19:48:54.131009 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tkmsz" podStartSLOduration=2.678181723 podStartE2EDuration="7.1309858s" podCreationTimestamp="2026-02-17 19:48:47 +0000 UTC" firstStartedPulling="2026-02-17 19:48:49.019646271 +0000 UTC m=+7500.395049546" lastFinishedPulling="2026-02-17 19:48:53.472450358 +0000 UTC m=+7504.847853623" observedRunningTime="2026-02-17 19:48:54.119076718 +0000 UTC m=+7505.494479983" watchObservedRunningTime="2026-02-17 19:48:54.1309858 +0000 UTC m=+7505.506389065" Feb 17 19:48:57 crc kubenswrapper[4892]: I0217 19:48:57.783177 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:57 crc kubenswrapper[4892]: I0217 19:48:57.784743 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:57 crc kubenswrapper[4892]: I0217 19:48:57.861157 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:58 crc 
kubenswrapper[4892]: I0217 19:48:58.242918 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:48:58 crc kubenswrapper[4892]: I0217 19:48:58.305222 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tkmsz"] Feb 17 19:49:00 crc kubenswrapper[4892]: I0217 19:49:00.177386 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tkmsz" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="registry-server" containerID="cri-o://8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156" gracePeriod=2 Feb 17 19:49:00 crc kubenswrapper[4892]: I0217 19:49:00.863876 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.033963 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgfkf\" (UniqueName: \"kubernetes.io/projected/359bffdc-f01a-413c-959f-52bf4bac680f-kube-api-access-vgfkf\") pod \"359bffdc-f01a-413c-959f-52bf4bac680f\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.034238 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-utilities\") pod \"359bffdc-f01a-413c-959f-52bf4bac680f\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.034294 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-catalog-content\") pod \"359bffdc-f01a-413c-959f-52bf4bac680f\" (UID: \"359bffdc-f01a-413c-959f-52bf4bac680f\") " Feb 17 19:49:01 crc 
kubenswrapper[4892]: I0217 19:49:01.036129 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-utilities" (OuterVolumeSpecName: "utilities") pod "359bffdc-f01a-413c-959f-52bf4bac680f" (UID: "359bffdc-f01a-413c-959f-52bf4bac680f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.042663 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359bffdc-f01a-413c-959f-52bf4bac680f-kube-api-access-vgfkf" (OuterVolumeSpecName: "kube-api-access-vgfkf") pod "359bffdc-f01a-413c-959f-52bf4bac680f" (UID: "359bffdc-f01a-413c-959f-52bf4bac680f"). InnerVolumeSpecName "kube-api-access-vgfkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.063716 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "359bffdc-f01a-413c-959f-52bf4bac680f" (UID: "359bffdc-f01a-413c-959f-52bf4bac680f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.138891 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgfkf\" (UniqueName: \"kubernetes.io/projected/359bffdc-f01a-413c-959f-52bf4bac680f-kube-api-access-vgfkf\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.138956 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.138972 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359bffdc-f01a-413c-959f-52bf4bac680f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.190325 4892 generic.go:334] "Generic (PLEG): container finished" podID="359bffdc-f01a-413c-959f-52bf4bac680f" containerID="8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156" exitCode=0 Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.190427 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerDied","Data":"8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156"} Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.190581 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tkmsz" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.190871 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tkmsz" event={"ID":"359bffdc-f01a-413c-959f-52bf4bac680f","Type":"ContainerDied","Data":"8dfd42ab5d504cf7680b02766fddc984ca6bcd85d62700d6ac5b183dce01d0a3"} Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.190904 4892 scope.go:117] "RemoveContainer" containerID="8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.226722 4892 scope.go:117] "RemoveContainer" containerID="16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.231176 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tkmsz"] Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.240890 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tkmsz"] Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.259052 4892 scope.go:117] "RemoveContainer" containerID="d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.313660 4892 scope.go:117] "RemoveContainer" containerID="8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156" Feb 17 19:49:01 crc kubenswrapper[4892]: E0217 19:49:01.314065 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156\": container with ID starting with 8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156 not found: ID does not exist" containerID="8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.314102 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156"} err="failed to get container status \"8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156\": rpc error: code = NotFound desc = could not find container \"8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156\": container with ID starting with 8dba8d546dcd26bdeb6fedd0a4ade505c7181636dda8e2aea752e84f6ec22156 not found: ID does not exist" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.314122 4892 scope.go:117] "RemoveContainer" containerID="16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925" Feb 17 19:49:01 crc kubenswrapper[4892]: E0217 19:49:01.314356 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925\": container with ID starting with 16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925 not found: ID does not exist" containerID="16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.314377 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925"} err="failed to get container status \"16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925\": rpc error: code = NotFound desc = could not find container \"16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925\": container with ID starting with 16c596fcb115a496ebfd82fe3718b9f1a6a1eb91c701e2526ea3e7f9e1eb3925 not found: ID does not exist" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.314388 4892 scope.go:117] "RemoveContainer" containerID="d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b" Feb 17 19:49:01 crc kubenswrapper[4892]: E0217 
19:49:01.314596 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b\": container with ID starting with d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b not found: ID does not exist" containerID="d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.314620 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b"} err="failed to get container status \"d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b\": rpc error: code = NotFound desc = could not find container \"d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b\": container with ID starting with d78a31682b8c99bfe2afd86e518802a145a7010e3eb775a4daa120a49e4ddb8b not found: ID does not exist" Feb 17 19:49:01 crc kubenswrapper[4892]: I0217 19:49:01.386688 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" path="/var/lib/kubelet/pods/359bffdc-f01a-413c-959f-52bf4bac680f/volumes" Feb 17 19:49:23 crc kubenswrapper[4892]: I0217 19:49:23.465765 4892 generic.go:334] "Generic (PLEG): container finished" podID="9d911062-0c28-415c-bdb4-6288348c105a" containerID="00b9910611f0d688b58b552a120b233b90389f22879d0906eea69800c05c42c8" exitCode=0 Feb 17 19:49:23 crc kubenswrapper[4892]: I0217 19:49:23.465850 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" event={"ID":"9d911062-0c28-415c-bdb4-6288348c105a","Type":"ContainerDied","Data":"00b9910611f0d688b58b552a120b233b90389f22879d0906eea69800c05c42c8"} Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.085126 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.235505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzsqg\" (UniqueName: \"kubernetes.io/projected/9d911062-0c28-415c-bdb4-6288348c105a-kube-api-access-zzsqg\") pod \"9d911062-0c28-415c-bdb4-6288348c105a\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.235632 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ssh-key-openstack-cell1\") pod \"9d911062-0c28-415c-bdb4-6288348c105a\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.235767 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ceph\") pod \"9d911062-0c28-415c-bdb4-6288348c105a\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.235951 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-inventory\") pod \"9d911062-0c28-415c-bdb4-6288348c105a\" (UID: \"9d911062-0c28-415c-bdb4-6288348c105a\") " Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.240580 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ceph" (OuterVolumeSpecName: "ceph") pod "9d911062-0c28-415c-bdb4-6288348c105a" (UID: "9d911062-0c28-415c-bdb4-6288348c105a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.241053 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d911062-0c28-415c-bdb4-6288348c105a-kube-api-access-zzsqg" (OuterVolumeSpecName: "kube-api-access-zzsqg") pod "9d911062-0c28-415c-bdb4-6288348c105a" (UID: "9d911062-0c28-415c-bdb4-6288348c105a"). InnerVolumeSpecName "kube-api-access-zzsqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.267041 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-inventory" (OuterVolumeSpecName: "inventory") pod "9d911062-0c28-415c-bdb4-6288348c105a" (UID: "9d911062-0c28-415c-bdb4-6288348c105a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.272264 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9d911062-0c28-415c-bdb4-6288348c105a" (UID: "9d911062-0c28-415c-bdb4-6288348c105a"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.338927 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzsqg\" (UniqueName: \"kubernetes.io/projected/9d911062-0c28-415c-bdb4-6288348c105a-kube-api-access-zzsqg\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.338962 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.338977 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.338990 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d911062-0c28-415c-bdb4-6288348c105a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.493103 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" event={"ID":"9d911062-0c28-415c-bdb4-6288348c105a","Type":"ContainerDied","Data":"9639c3e112bad002d2b20cdd9857e97ec168a756668710d4584999c5e9af5e9c"} Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.493140 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lvk64" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.493158 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9639c3e112bad002d2b20cdd9857e97ec168a756668710d4584999c5e9af5e9c" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.581527 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-v6tgq"] Feb 17 19:49:25 crc kubenswrapper[4892]: E0217 19:49:25.582088 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d911062-0c28-415c-bdb4-6288348c105a" containerName="configure-network-openstack-openstack-cell1" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.582109 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d911062-0c28-415c-bdb4-6288348c105a" containerName="configure-network-openstack-openstack-cell1" Feb 17 19:49:25 crc kubenswrapper[4892]: E0217 19:49:25.582127 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="extract-utilities" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.582136 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="extract-utilities" Feb 17 19:49:25 crc kubenswrapper[4892]: E0217 19:49:25.582157 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="extract-content" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.582163 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="extract-content" Feb 17 19:49:25 crc kubenswrapper[4892]: E0217 19:49:25.582185 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="registry-server" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 
19:49:25.582193 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="registry-server" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.582443 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="359bffdc-f01a-413c-959f-52bf4bac680f" containerName="registry-server" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.582467 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d911062-0c28-415c-bdb4-6288348c105a" containerName="configure-network-openstack-openstack-cell1" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.583381 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.587849 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.588040 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.588176 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.588670 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.592047 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-v6tgq"] Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.748565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxtt\" (UniqueName: \"kubernetes.io/projected/b4cb43e5-177c-462c-acba-1a4ac62bf30d-kube-api-access-tpxtt\") pod 
\"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.749062 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ceph\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.749134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.749798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-inventory\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.852153 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-inventory\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.852241 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxtt\" (UniqueName: \"kubernetes.io/projected/b4cb43e5-177c-462c-acba-1a4ac62bf30d-kube-api-access-tpxtt\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.852285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ceph\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.852333 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.857128 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-inventory\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.858253 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ceph\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " 
pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.861563 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.882849 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxtt\" (UniqueName: \"kubernetes.io/projected/b4cb43e5-177c-462c-acba-1a4ac62bf30d-kube-api-access-tpxtt\") pod \"validate-network-openstack-openstack-cell1-v6tgq\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:25 crc kubenswrapper[4892]: I0217 19:49:25.908643 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:26 crc kubenswrapper[4892]: I0217 19:49:26.487139 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-v6tgq"] Feb 17 19:49:26 crc kubenswrapper[4892]: I0217 19:49:26.513518 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" event={"ID":"b4cb43e5-177c-462c-acba-1a4ac62bf30d","Type":"ContainerStarted","Data":"dd4b1740545dd0c993d0d4fa9ce386373481c9a2c236cbaabacce69b270403fe"} Feb 17 19:49:27 crc kubenswrapper[4892]: I0217 19:49:27.525787 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" event={"ID":"b4cb43e5-177c-462c-acba-1a4ac62bf30d","Type":"ContainerStarted","Data":"633213b9009860209d4b1d74fc35c392b738e834d816302b8a71c101d0834714"} Feb 17 19:49:27 crc kubenswrapper[4892]: I0217 19:49:27.546765 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" podStartSLOduration=2.146054479 podStartE2EDuration="2.546746439s" podCreationTimestamp="2026-02-17 19:49:25 +0000 UTC" firstStartedPulling="2026-02-17 19:49:26.485607786 +0000 UTC m=+7537.861011051" lastFinishedPulling="2026-02-17 19:49:26.886299736 +0000 UTC m=+7538.261703011" observedRunningTime="2026-02-17 19:49:27.540851 +0000 UTC m=+7538.916254285" watchObservedRunningTime="2026-02-17 19:49:27.546746439 +0000 UTC m=+7538.922149704" Feb 17 19:49:32 crc kubenswrapper[4892]: I0217 19:49:32.606569 4892 generic.go:334] "Generic (PLEG): container finished" podID="b4cb43e5-177c-462c-acba-1a4ac62bf30d" containerID="633213b9009860209d4b1d74fc35c392b738e834d816302b8a71c101d0834714" exitCode=0 Feb 17 19:49:32 crc kubenswrapper[4892]: I0217 19:49:32.606668 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" event={"ID":"b4cb43e5-177c-462c-acba-1a4ac62bf30d","Type":"ContainerDied","Data":"633213b9009860209d4b1d74fc35c392b738e834d816302b8a71c101d0834714"} Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.174145 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.275328 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ssh-key-openstack-cell1\") pod \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.275415 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxtt\" (UniqueName: \"kubernetes.io/projected/b4cb43e5-177c-462c-acba-1a4ac62bf30d-kube-api-access-tpxtt\") pod \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.275505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ceph\") pod \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.275527 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-inventory\") pod \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\" (UID: \"b4cb43e5-177c-462c-acba-1a4ac62bf30d\") " Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.280553 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ceph" (OuterVolumeSpecName: "ceph") pod "b4cb43e5-177c-462c-acba-1a4ac62bf30d" (UID: "b4cb43e5-177c-462c-acba-1a4ac62bf30d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.280834 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cb43e5-177c-462c-acba-1a4ac62bf30d-kube-api-access-tpxtt" (OuterVolumeSpecName: "kube-api-access-tpxtt") pod "b4cb43e5-177c-462c-acba-1a4ac62bf30d" (UID: "b4cb43e5-177c-462c-acba-1a4ac62bf30d"). InnerVolumeSpecName "kube-api-access-tpxtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.305864 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b4cb43e5-177c-462c-acba-1a4ac62bf30d" (UID: "b4cb43e5-177c-462c-acba-1a4ac62bf30d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.317617 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-inventory" (OuterVolumeSpecName: "inventory") pod "b4cb43e5-177c-462c-acba-1a4ac62bf30d" (UID: "b4cb43e5-177c-462c-acba-1a4ac62bf30d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.380600 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.380660 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxtt\" (UniqueName: \"kubernetes.io/projected/b4cb43e5-177c-462c-acba-1a4ac62bf30d-kube-api-access-tpxtt\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.380683 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.380700 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4cb43e5-177c-462c-acba-1a4ac62bf30d-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.641103 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" event={"ID":"b4cb43e5-177c-462c-acba-1a4ac62bf30d","Type":"ContainerDied","Data":"dd4b1740545dd0c993d0d4fa9ce386373481c9a2c236cbaabacce69b270403fe"} Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.641745 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd4b1740545dd0c993d0d4fa9ce386373481c9a2c236cbaabacce69b270403fe" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.641178 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-v6tgq" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.778609 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mt4f2"] Feb 17 19:49:34 crc kubenswrapper[4892]: E0217 19:49:34.779383 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cb43e5-177c-462c-acba-1a4ac62bf30d" containerName="validate-network-openstack-openstack-cell1" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.779456 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cb43e5-177c-462c-acba-1a4ac62bf30d" containerName="validate-network-openstack-openstack-cell1" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.779786 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cb43e5-177c-462c-acba-1a4ac62bf30d" containerName="validate-network-openstack-openstack-cell1" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.780871 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.782903 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.783888 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.784554 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.784680 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.789023 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mt4f2"] Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.891471 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ceph\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.891639 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.891885 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kdv79\" (UniqueName: \"kubernetes.io/projected/fab94950-73bd-4831-8872-84219a776cef-kube-api-access-kdv79\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.892037 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-inventory\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.994444 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-inventory\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.994576 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ceph\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.994655 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 
19:49:34.994721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdv79\" (UniqueName: \"kubernetes.io/projected/fab94950-73bd-4831-8872-84219a776cef-kube-api-access-kdv79\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:34 crc kubenswrapper[4892]: I0217 19:49:34.999362 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ceph\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:35 crc kubenswrapper[4892]: I0217 19:49:35.002168 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:35 crc kubenswrapper[4892]: I0217 19:49:35.007991 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-inventory\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:35 crc kubenswrapper[4892]: I0217 19:49:35.021088 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdv79\" (UniqueName: \"kubernetes.io/projected/fab94950-73bd-4831-8872-84219a776cef-kube-api-access-kdv79\") pod \"install-os-openstack-openstack-cell1-mt4f2\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " 
pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:35 crc kubenswrapper[4892]: I0217 19:49:35.101044 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:49:35 crc kubenswrapper[4892]: I0217 19:49:35.737433 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mt4f2"] Feb 17 19:49:36 crc kubenswrapper[4892]: I0217 19:49:36.674784 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" event={"ID":"fab94950-73bd-4831-8872-84219a776cef","Type":"ContainerStarted","Data":"15bb513c08b4971e9d0da7be4489c6842326b9fde6bf135b80fc492048ca63db"} Feb 17 19:49:36 crc kubenswrapper[4892]: I0217 19:49:36.675482 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" event={"ID":"fab94950-73bd-4831-8872-84219a776cef","Type":"ContainerStarted","Data":"73a14380c302c3df08254542678d992979ba4c880c863e053574a1c8ce631370"} Feb 17 19:49:36 crc kubenswrapper[4892]: I0217 19:49:36.698053 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" podStartSLOduration=2.285987771 podStartE2EDuration="2.698034817s" podCreationTimestamp="2026-02-17 19:49:34 +0000 UTC" firstStartedPulling="2026-02-17 19:49:35.736293938 +0000 UTC m=+7547.111697203" lastFinishedPulling="2026-02-17 19:49:36.148340984 +0000 UTC m=+7547.523744249" observedRunningTime="2026-02-17 19:49:36.690581746 +0000 UTC m=+7548.065985001" watchObservedRunningTime="2026-02-17 19:49:36.698034817 +0000 UTC m=+7548.073438082" Feb 17 19:49:59 crc kubenswrapper[4892]: I0217 19:49:59.747655 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f6mjl"] Feb 17 19:49:59 crc kubenswrapper[4892]: I0217 19:49:59.750555 4892 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:49:59 crc kubenswrapper[4892]: I0217 19:49:59.761541 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6mjl"] Feb 17 19:49:59 crc kubenswrapper[4892]: I0217 19:49:59.904797 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2897de4e-559e-434d-8a35-944d76f621f2-utilities\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:49:59 crc kubenswrapper[4892]: I0217 19:49:59.904849 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzgr\" (UniqueName: \"kubernetes.io/projected/2897de4e-559e-434d-8a35-944d76f621f2-kube-api-access-mzzgr\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:49:59 crc kubenswrapper[4892]: I0217 19:49:59.904917 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2897de4e-559e-434d-8a35-944d76f621f2-catalog-content\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.006710 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2897de4e-559e-434d-8a35-944d76f621f2-utilities\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.006750 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mzzgr\" (UniqueName: \"kubernetes.io/projected/2897de4e-559e-434d-8a35-944d76f621f2-kube-api-access-mzzgr\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.006844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2897de4e-559e-434d-8a35-944d76f621f2-catalog-content\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.007261 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2897de4e-559e-434d-8a35-944d76f621f2-utilities\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.007307 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2897de4e-559e-434d-8a35-944d76f621f2-catalog-content\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.026675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzgr\" (UniqueName: \"kubernetes.io/projected/2897de4e-559e-434d-8a35-944d76f621f2-kube-api-access-mzzgr\") pod \"redhat-operators-f6mjl\" (UID: \"2897de4e-559e-434d-8a35-944d76f621f2\") " pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.094047 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.627349 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6mjl"] Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.979642 4892 generic.go:334] "Generic (PLEG): container finished" podID="2897de4e-559e-434d-8a35-944d76f621f2" containerID="e88484261dc5dcd984c9aa41694422900dc048bd7112f7b366a34fe836d4b332" exitCode=0 Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.979740 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6mjl" event={"ID":"2897de4e-559e-434d-8a35-944d76f621f2","Type":"ContainerDied","Data":"e88484261dc5dcd984c9aa41694422900dc048bd7112f7b366a34fe836d4b332"} Feb 17 19:50:00 crc kubenswrapper[4892]: I0217 19:50:00.979981 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6mjl" event={"ID":"2897de4e-559e-434d-8a35-944d76f621f2","Type":"ContainerStarted","Data":"3b7113e2cfba7a15fd791891bcd4a49764756f3e878f7b8a9d335dab3bbff6f2"} Feb 17 19:50:14 crc kubenswrapper[4892]: I0217 19:50:14.234194 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6mjl" event={"ID":"2897de4e-559e-434d-8a35-944d76f621f2","Type":"ContainerStarted","Data":"4747c7157936a09646b919071dcfd5ebec08851f13a13dbf7bf665a38a0efbf4"} Feb 17 19:50:15 crc kubenswrapper[4892]: I0217 19:50:15.247236 4892 generic.go:334] "Generic (PLEG): container finished" podID="2897de4e-559e-434d-8a35-944d76f621f2" containerID="4747c7157936a09646b919071dcfd5ebec08851f13a13dbf7bf665a38a0efbf4" exitCode=0 Feb 17 19:50:15 crc kubenswrapper[4892]: I0217 19:50:15.247288 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6mjl" 
event={"ID":"2897de4e-559e-434d-8a35-944d76f621f2","Type":"ContainerDied","Data":"4747c7157936a09646b919071dcfd5ebec08851f13a13dbf7bf665a38a0efbf4"} Feb 17 19:50:16 crc kubenswrapper[4892]: I0217 19:50:16.260160 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6mjl" event={"ID":"2897de4e-559e-434d-8a35-944d76f621f2","Type":"ContainerStarted","Data":"520b4baaac8b9ae710ccbafce96153f9776a56441482a40dee8b26dfdfd73770"} Feb 17 19:50:16 crc kubenswrapper[4892]: I0217 19:50:16.289934 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f6mjl" podStartSLOduration=2.573845157 podStartE2EDuration="17.289916188s" podCreationTimestamp="2026-02-17 19:49:59 +0000 UTC" firstStartedPulling="2026-02-17 19:50:00.982658563 +0000 UTC m=+7572.358061828" lastFinishedPulling="2026-02-17 19:50:15.698729594 +0000 UTC m=+7587.074132859" observedRunningTime="2026-02-17 19:50:16.277077211 +0000 UTC m=+7587.652480466" watchObservedRunningTime="2026-02-17 19:50:16.289916188 +0000 UTC m=+7587.665319453" Feb 17 19:50:20 crc kubenswrapper[4892]: I0217 19:50:20.094693 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:20 crc kubenswrapper[4892]: I0217 19:50:20.095055 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:21 crc kubenswrapper[4892]: I0217 19:50:21.143956 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f6mjl" podUID="2897de4e-559e-434d-8a35-944d76f621f2" containerName="registry-server" probeResult="failure" output=< Feb 17 19:50:21 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 19:50:21 crc kubenswrapper[4892]: > Feb 17 19:50:25 crc kubenswrapper[4892]: I0217 19:50:25.362187 4892 generic.go:334] "Generic (PLEG): 
container finished" podID="fab94950-73bd-4831-8872-84219a776cef" containerID="15bb513c08b4971e9d0da7be4489c6842326b9fde6bf135b80fc492048ca63db" exitCode=0 Feb 17 19:50:25 crc kubenswrapper[4892]: I0217 19:50:25.376507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" event={"ID":"fab94950-73bd-4831-8872-84219a776cef","Type":"ContainerDied","Data":"15bb513c08b4971e9d0da7be4489c6842326b9fde6bf135b80fc492048ca63db"} Feb 17 19:50:26 crc kubenswrapper[4892]: I0217 19:50:26.930505 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.005464 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-inventory\") pod \"fab94950-73bd-4831-8872-84219a776cef\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.005654 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdv79\" (UniqueName: \"kubernetes.io/projected/fab94950-73bd-4831-8872-84219a776cef-kube-api-access-kdv79\") pod \"fab94950-73bd-4831-8872-84219a776cef\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.005715 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ceph\") pod \"fab94950-73bd-4831-8872-84219a776cef\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.005779 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ssh-key-openstack-cell1\") 
pod \"fab94950-73bd-4831-8872-84219a776cef\" (UID: \"fab94950-73bd-4831-8872-84219a776cef\") " Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.016899 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab94950-73bd-4831-8872-84219a776cef-kube-api-access-kdv79" (OuterVolumeSpecName: "kube-api-access-kdv79") pod "fab94950-73bd-4831-8872-84219a776cef" (UID: "fab94950-73bd-4831-8872-84219a776cef"). InnerVolumeSpecName "kube-api-access-kdv79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.018022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ceph" (OuterVolumeSpecName: "ceph") pod "fab94950-73bd-4831-8872-84219a776cef" (UID: "fab94950-73bd-4831-8872-84219a776cef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.055321 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-inventory" (OuterVolumeSpecName: "inventory") pod "fab94950-73bd-4831-8872-84219a776cef" (UID: "fab94950-73bd-4831-8872-84219a776cef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.060945 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fab94950-73bd-4831-8872-84219a776cef" (UID: "fab94950-73bd-4831-8872-84219a776cef"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.108525 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdv79\" (UniqueName: \"kubernetes.io/projected/fab94950-73bd-4831-8872-84219a776cef-kube-api-access-kdv79\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.108562 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.108574 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.108582 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fab94950-73bd-4831-8872-84219a776cef-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.387626 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" event={"ID":"fab94950-73bd-4831-8872-84219a776cef","Type":"ContainerDied","Data":"73a14380c302c3df08254542678d992979ba4c880c863e053574a1c8ce631370"} Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.387676 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73a14380c302c3df08254542678d992979ba4c880c863e053574a1c8ce631370" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.387717 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mt4f2" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.493968 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-xqknj"] Feb 17 19:50:27 crc kubenswrapper[4892]: E0217 19:50:27.494766 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab94950-73bd-4831-8872-84219a776cef" containerName="install-os-openstack-openstack-cell1" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.494801 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab94950-73bd-4831-8872-84219a776cef" containerName="install-os-openstack-openstack-cell1" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.495248 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab94950-73bd-4831-8872-84219a776cef" containerName="install-os-openstack-openstack-cell1" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.496455 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.500289 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.500339 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.500627 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.500654 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.508064 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-xqknj"] Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.526265 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.526417 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-inventory\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.526481 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-x9c87\" (UniqueName: \"kubernetes.io/projected/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-kube-api-access-x9c87\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.526527 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ceph\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.628778 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-inventory\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.628856 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9c87\" (UniqueName: \"kubernetes.io/projected/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-kube-api-access-x9c87\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.628902 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ceph\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: 
I0217 19:50:27.629159 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.633371 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.636841 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-inventory\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.640719 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ceph\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.652429 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9c87\" (UniqueName: \"kubernetes.io/projected/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-kube-api-access-x9c87\") pod \"configure-os-openstack-openstack-cell1-xqknj\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " 
pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:27 crc kubenswrapper[4892]: I0217 19:50:27.829727 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:50:28 crc kubenswrapper[4892]: I0217 19:50:28.461474 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-xqknj"] Feb 17 19:50:29 crc kubenswrapper[4892]: I0217 19:50:29.414274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" event={"ID":"a33740f5-3f9c-4f60-a0b9-cc91fbed6701","Type":"ContainerStarted","Data":"caf9af17e5c66534251da1c497b54ce667f4f83d58ef685ba64adc875cc62790"} Feb 17 19:50:29 crc kubenswrapper[4892]: I0217 19:50:29.414557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" event={"ID":"a33740f5-3f9c-4f60-a0b9-cc91fbed6701","Type":"ContainerStarted","Data":"5c5b99bad09390cbf69c81c500cf54210aca6785247d0cfd89d3dde025cb08a6"} Feb 17 19:50:29 crc kubenswrapper[4892]: I0217 19:50:29.440867 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" podStartSLOduration=1.968149993 podStartE2EDuration="2.440848206s" podCreationTimestamp="2026-02-17 19:50:27 +0000 UTC" firstStartedPulling="2026-02-17 19:50:28.451917053 +0000 UTC m=+7599.827320318" lastFinishedPulling="2026-02-17 19:50:28.924615266 +0000 UTC m=+7600.300018531" observedRunningTime="2026-02-17 19:50:29.436578361 +0000 UTC m=+7600.811981626" watchObservedRunningTime="2026-02-17 19:50:29.440848206 +0000 UTC m=+7600.816251481" Feb 17 19:50:30 crc kubenswrapper[4892]: I0217 19:50:30.187353 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:30 crc kubenswrapper[4892]: I0217 19:50:30.331212 
4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f6mjl" Feb 17 19:50:30 crc kubenswrapper[4892]: I0217 19:50:30.805577 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6mjl"] Feb 17 19:50:30 crc kubenswrapper[4892]: I0217 19:50:30.959795 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gfc4l"] Feb 17 19:50:30 crc kubenswrapper[4892]: I0217 19:50:30.960371 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gfc4l" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="registry-server" containerID="cri-o://e57bf64fd272b83c9b0c76454389d556e20f18a0cfda0881aa6288c49cb07d10" gracePeriod=2 Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.153747 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fj4g8"] Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.154232 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fj4g8" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="registry-server" containerID="cri-o://1619ab7d9ed780d2cc40a936c7b7d6cabc77abf480db3f68fced340a2f40a7f1" gracePeriod=2 Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.353671 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5v59h"] Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.354676 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5v59h" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="registry-server" containerID="cri-o://891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb" gracePeriod=2 Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.490555 4892 generic.go:334] 
"Generic (PLEG): container finished" podID="fee21c95-556a-4a25-8b38-273489f881e4" containerID="1619ab7d9ed780d2cc40a936c7b7d6cabc77abf480db3f68fced340a2f40a7f1" exitCode=0 Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.490652 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerDied","Data":"1619ab7d9ed780d2cc40a936c7b7d6cabc77abf480db3f68fced340a2f40a7f1"} Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.504954 4892 generic.go:334] "Generic (PLEG): container finished" podID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerID="e57bf64fd272b83c9b0c76454389d556e20f18a0cfda0881aa6288c49cb07d10" exitCode=0 Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.505440 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerDied","Data":"e57bf64fd272b83c9b0c76454389d556e20f18a0cfda0881aa6288c49cb07d10"} Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.505484 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfc4l" event={"ID":"31427c94-1de1-431c-9f4b-e87f01548d3f","Type":"ContainerDied","Data":"15daf13eff4370c279d6830d07c6b949cfd623dac781dc351e393c57a91b833c"} Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.505494 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15daf13eff4370c279d6830d07c6b949cfd623dac781dc351e393c57a91b833c" Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.561195 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbtj9"] Feb 17 19:50:31 crc kubenswrapper[4892]: I0217 19:50:31.561455 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbtj9" 
podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="registry-server" containerID="cri-o://f057e6826ee9b4451acb2ecf5775b8d00830c9d920d4c51c7a05610476ed7701" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.694028 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.697575 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.723741 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hbfn\" (UniqueName: \"kubernetes.io/projected/fee21c95-556a-4a25-8b38-273489f881e4-kube-api-access-2hbfn\") pod \"fee21c95-556a-4a25-8b38-273489f881e4\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.723798 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-catalog-content\") pod \"31427c94-1de1-431c-9f4b-e87f01548d3f\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.723946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-utilities\") pod \"31427c94-1de1-431c-9f4b-e87f01548d3f\" (UID: \"31427c94-1de1-431c-9f4b-e87f01548d3f\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.724000 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-454vb\" (UniqueName: \"kubernetes.io/projected/31427c94-1de1-431c-9f4b-e87f01548d3f-kube-api-access-454vb\") pod \"31427c94-1de1-431c-9f4b-e87f01548d3f\" (UID: 
\"31427c94-1de1-431c-9f4b-e87f01548d3f\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.724108 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-utilities\") pod \"fee21c95-556a-4a25-8b38-273489f881e4\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.724157 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-catalog-content\") pod \"fee21c95-556a-4a25-8b38-273489f881e4\" (UID: \"fee21c95-556a-4a25-8b38-273489f881e4\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.727454 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-utilities" (OuterVolumeSpecName: "utilities") pod "fee21c95-556a-4a25-8b38-273489f881e4" (UID: "fee21c95-556a-4a25-8b38-273489f881e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.729971 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-utilities" (OuterVolumeSpecName: "utilities") pod "31427c94-1de1-431c-9f4b-e87f01548d3f" (UID: "31427c94-1de1-431c-9f4b-e87f01548d3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.739026 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31427c94-1de1-431c-9f4b-e87f01548d3f-kube-api-access-454vb" (OuterVolumeSpecName: "kube-api-access-454vb") pod "31427c94-1de1-431c-9f4b-e87f01548d3f" (UID: "31427c94-1de1-431c-9f4b-e87f01548d3f"). 
InnerVolumeSpecName "kube-api-access-454vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.739172 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee21c95-556a-4a25-8b38-273489f881e4-kube-api-access-2hbfn" (OuterVolumeSpecName: "kube-api-access-2hbfn") pod "fee21c95-556a-4a25-8b38-273489f881e4" (UID: "fee21c95-556a-4a25-8b38-273489f881e4"). InnerVolumeSpecName "kube-api-access-2hbfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.829107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fee21c95-556a-4a25-8b38-273489f881e4" (UID: "fee21c95-556a-4a25-8b38-273489f881e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.832159 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.832187 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee21c95-556a-4a25-8b38-273489f881e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.832199 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hbfn\" (UniqueName: \"kubernetes.io/projected/fee21c95-556a-4a25-8b38-273489f881e4-kube-api-access-2hbfn\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.832209 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.832221 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-454vb\" (UniqueName: \"kubernetes.io/projected/31427c94-1de1-431c-9f4b-e87f01548d3f-kube-api-access-454vb\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.839404 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x8f4q"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.839673 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x8f4q" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="registry-server" containerID="cri-o://2ba90fc2d816105e9338e3455c3d172641daf2c4e534d92d6555ad37ec55401b" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.878469 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31427c94-1de1-431c-9f4b-e87f01548d3f" (UID: "31427c94-1de1-431c-9f4b-e87f01548d3f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.935321 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31427c94-1de1-431c-9f4b-e87f01548d3f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.959649 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztgz4"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.960315 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ztgz4" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="registry-server" containerID="cri-o://cc4815a035b4957f5c46b6adc05333fe70f215427c1b860521e216f31e27e5e6" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:31.960401 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.139736 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpsgw\" (UniqueName: \"kubernetes.io/projected/272b1940-458c-4610-a618-fcc2a4fab95c-kube-api-access-gpsgw\") pod \"272b1940-458c-4610-a618-fcc2a4fab95c\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.139805 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-utilities\") pod \"272b1940-458c-4610-a618-fcc2a4fab95c\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.140022 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-catalog-content\") pod \"272b1940-458c-4610-a618-fcc2a4fab95c\" (UID: \"272b1940-458c-4610-a618-fcc2a4fab95c\") " Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.142160 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-utilities" (OuterVolumeSpecName: "utilities") pod "272b1940-458c-4610-a618-fcc2a4fab95c" (UID: "272b1940-458c-4610-a618-fcc2a4fab95c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.152005 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272b1940-458c-4610-a618-fcc2a4fab95c-kube-api-access-gpsgw" (OuterVolumeSpecName: "kube-api-access-gpsgw") pod "272b1940-458c-4610-a618-fcc2a4fab95c" (UID: "272b1940-458c-4610-a618-fcc2a4fab95c"). InnerVolumeSpecName "kube-api-access-gpsgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.167776 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vvnr"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.168208 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6vvnr" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="registry-server" containerID="cri-o://e51c1574a60189904fadbce2d623f904d53f6722685a80e95f6a12844fd34fa7" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.223468 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "272b1940-458c-4610-a618-fcc2a4fab95c" (UID: "272b1940-458c-4610-a618-fcc2a4fab95c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.242809 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.242872 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpsgw\" (UniqueName: \"kubernetes.io/projected/272b1940-458c-4610-a618-fcc2a4fab95c-kube-api-access-gpsgw\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.242884 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/272b1940-458c-4610-a618-fcc2a4fab95c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.366927 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxtbb"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.367296 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jxtbb" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="registry-server" containerID="cri-o://296e009cfb6f2a21e4228283cfc44e00354c12e851eb69cdae67c43b2b299156" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.536152 4892 generic.go:334] "Generic (PLEG): container finished" podID="66354c96-340d-43c7-98d0-19713f857884" containerID="cc4815a035b4957f5c46b6adc05333fe70f215427c1b860521e216f31e27e5e6" exitCode=0 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.536473 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerDied","Data":"cc4815a035b4957f5c46b6adc05333fe70f215427c1b860521e216f31e27e5e6"} Feb 17 19:50:32 crc 
kubenswrapper[4892]: I0217 19:50:32.555424 4892 generic.go:334] "Generic (PLEG): container finished" podID="d276a412-68f8-4069-9aa2-275fdb23997d" containerID="f057e6826ee9b4451acb2ecf5775b8d00830c9d920d4c51c7a05610476ed7701" exitCode=0 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.555509 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerDied","Data":"f057e6826ee9b4451acb2ecf5775b8d00830c9d920d4c51c7a05610476ed7701"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.565331 4892 generic.go:334] "Generic (PLEG): container finished" podID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerID="296e009cfb6f2a21e4228283cfc44e00354c12e851eb69cdae67c43b2b299156" exitCode=0 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.565400 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerDied","Data":"296e009cfb6f2a21e4228283cfc44e00354c12e851eb69cdae67c43b2b299156"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.570935 4892 generic.go:334] "Generic (PLEG): container finished" podID="94374c20-25dd-491e-8778-0b00d10afda0" containerID="2ba90fc2d816105e9338e3455c3d172641daf2c4e534d92d6555ad37ec55401b" exitCode=0 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.570980 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerDied","Data":"2ba90fc2d816105e9338e3455c3d172641daf2c4e534d92d6555ad37ec55401b"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.575345 4892 generic.go:334] "Generic (PLEG): container finished" podID="272b1940-458c-4610-a618-fcc2a4fab95c" containerID="891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb" exitCode=0 Feb 17 19:50:32 crc kubenswrapper[4892]: 
I0217 19:50:32.575389 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerDied","Data":"891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.575409 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v59h" event={"ID":"272b1940-458c-4610-a618-fcc2a4fab95c","Type":"ContainerDied","Data":"53759a00a3b1e0e47a65764e82e44b06af984fccbc7bb1f7a409900a5eca81ed"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.575425 4892 scope.go:117] "RemoveContainer" containerID="891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.575608 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v59h" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.598193 4892 generic.go:334] "Generic (PLEG): container finished" podID="68547217-2996-4b25-9020-c2187ecfb42e" containerID="e51c1574a60189904fadbce2d623f904d53f6722685a80e95f6a12844fd34fa7" exitCode=0 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.598261 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerDied","Data":"e51c1574a60189904fadbce2d623f904d53f6722685a80e95f6a12844fd34fa7"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.614168 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lx4bn"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.614428 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fj4g8" 
event={"ID":"fee21c95-556a-4a25-8b38-273489f881e4","Type":"ContainerDied","Data":"7f76615bafe5269546c45acc46de2a55a419580d92eababd140e708cecbcd664"} Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.614504 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lx4bn" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="registry-server" containerID="cri-o://076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.614557 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fj4g8" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.614634 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gfc4l" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.651054 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5v59h"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.694172 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5v59h"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.695062 4892 scope.go:117] "RemoveContainer" containerID="2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.734986 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gfc4l"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.763278 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gfc4l"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.792233 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fj4g8"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 
19:50:32.802826 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fj4g8"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.805607 4892 scope.go:117] "RemoveContainer" containerID="a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.814865 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ct6mr"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.815209 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ct6mr" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="registry-server" containerID="cri-o://0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.839766 4892 scope.go:117] "RemoveContainer" containerID="891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb" Feb 17 19:50:32 crc kubenswrapper[4892]: E0217 19:50:32.840408 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb\": container with ID starting with 891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb not found: ID does not exist" containerID="891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.840438 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb"} err="failed to get container status \"891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb\": rpc error: code = NotFound desc = could not find container \"891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb\": container with ID starting with 
891529110bfd1783b1f20f998e261a4dea3331cf20f0710362cfdedb4a9666fb not found: ID does not exist" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.840457 4892 scope.go:117] "RemoveContainer" containerID="2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5" Feb 17 19:50:32 crc kubenswrapper[4892]: E0217 19:50:32.840825 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5\": container with ID starting with 2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5 not found: ID does not exist" containerID="2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.840847 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5"} err="failed to get container status \"2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5\": rpc error: code = NotFound desc = could not find container \"2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5\": container with ID starting with 2aa4c0258198354208f3bd9ca1757e9cc1b45a0be4571e56666814b568d697b5 not found: ID does not exist" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.840858 4892 scope.go:117] "RemoveContainer" containerID="a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9" Feb 17 19:50:32 crc kubenswrapper[4892]: E0217 19:50:32.841264 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9\": container with ID starting with a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9 not found: ID does not exist" containerID="a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9" Feb 17 19:50:32 crc 
kubenswrapper[4892]: I0217 19:50:32.841283 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9"} err="failed to get container status \"a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9\": rpc error: code = NotFound desc = could not find container \"a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9\": container with ID starting with a44b89bd4c8b505a828ec24bcfbc3fb71032b42d182573461b50b00ba5587cb9 not found: ID does not exist" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.841296 4892 scope.go:117] "RemoveContainer" containerID="1619ab7d9ed780d2cc40a936c7b7d6cabc77abf480db3f68fced340a2f40a7f1" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.934110 4892 scope.go:117] "RemoveContainer" containerID="554ddb300fba674d1cc5a1a361ac7dfbc35d6e4f749cb9231a2a8532bd44450e" Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.960732 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2k8h"] Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.961027 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k2k8h" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="registry-server" containerID="cri-o://098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af" gracePeriod=2 Feb 17 19:50:32 crc kubenswrapper[4892]: I0217 19:50:32.987310 4892 scope.go:117] "RemoveContainer" containerID="557f85612b4b8b4d561466b7153151f8ec59e2074c2dabb0b90a2fe562b3c1aa" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.036404 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.063568 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.072424 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-catalog-content\") pod \"94374c20-25dd-491e-8778-0b00d10afda0\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.072503 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4cr9\" (UniqueName: \"kubernetes.io/projected/94374c20-25dd-491e-8778-0b00d10afda0-kube-api-access-m4cr9\") pod \"94374c20-25dd-491e-8778-0b00d10afda0\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.072594 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-utilities\") pod \"94374c20-25dd-491e-8778-0b00d10afda0\" (UID: \"94374c20-25dd-491e-8778-0b00d10afda0\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.072656 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnwvb\" (UniqueName: \"kubernetes.io/projected/68547217-2996-4b25-9020-c2187ecfb42e-kube-api-access-pnwvb\") pod \"68547217-2996-4b25-9020-c2187ecfb42e\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.072684 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-catalog-content\") pod \"68547217-2996-4b25-9020-c2187ecfb42e\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.072740 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-utilities\") pod \"68547217-2996-4b25-9020-c2187ecfb42e\" (UID: \"68547217-2996-4b25-9020-c2187ecfb42e\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.074645 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-utilities" (OuterVolumeSpecName: "utilities") pod "68547217-2996-4b25-9020-c2187ecfb42e" (UID: "68547217-2996-4b25-9020-c2187ecfb42e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.082193 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-utilities" (OuterVolumeSpecName: "utilities") pod "94374c20-25dd-491e-8778-0b00d10afda0" (UID: "94374c20-25dd-491e-8778-0b00d10afda0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.099992 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68547217-2996-4b25-9020-c2187ecfb42e-kube-api-access-pnwvb" (OuterVolumeSpecName: "kube-api-access-pnwvb") pod "68547217-2996-4b25-9020-c2187ecfb42e" (UID: "68547217-2996-4b25-9020-c2187ecfb42e"). InnerVolumeSpecName "kube-api-access-pnwvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.100274 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94374c20-25dd-491e-8778-0b00d10afda0-kube-api-access-m4cr9" (OuterVolumeSpecName: "kube-api-access-m4cr9") pod "94374c20-25dd-491e-8778-0b00d10afda0" (UID: "94374c20-25dd-491e-8778-0b00d10afda0"). InnerVolumeSpecName "kube-api-access-m4cr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.107749 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.132973 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.174105 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-utilities\") pod \"66354c96-340d-43c7-98d0-19713f857884\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.174220 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgplj\" (UniqueName: \"kubernetes.io/projected/66354c96-340d-43c7-98d0-19713f857884-kube-api-access-jgplj\") pod \"66354c96-340d-43c7-98d0-19713f857884\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.174295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-catalog-content\") pod \"66354c96-340d-43c7-98d0-19713f857884\" (UID: \"66354c96-340d-43c7-98d0-19713f857884\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.174440 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-catalog-content\") pod \"3425377d-d147-4bc3-a063-5e3c9456d2f9\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.174488 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-kl52n\" (UniqueName: \"kubernetes.io/projected/3425377d-d147-4bc3-a063-5e3c9456d2f9-kube-api-access-kl52n\") pod \"3425377d-d147-4bc3-a063-5e3c9456d2f9\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.174544 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-utilities\") pod \"3425377d-d147-4bc3-a063-5e3c9456d2f9\" (UID: \"3425377d-d147-4bc3-a063-5e3c9456d2f9\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.175181 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4cr9\" (UniqueName: \"kubernetes.io/projected/94374c20-25dd-491e-8778-0b00d10afda0-kube-api-access-m4cr9\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.175195 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.175204 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnwvb\" (UniqueName: \"kubernetes.io/projected/68547217-2996-4b25-9020-c2187ecfb42e-kube-api-access-pnwvb\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.175212 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.175580 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-utilities" (OuterVolumeSpecName: "utilities") pod "3425377d-d147-4bc3-a063-5e3c9456d2f9" (UID: 
"3425377d-d147-4bc3-a063-5e3c9456d2f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.176357 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-utilities" (OuterVolumeSpecName: "utilities") pod "66354c96-340d-43c7-98d0-19713f857884" (UID: "66354c96-340d-43c7-98d0-19713f857884"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.204596 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3425377d-d147-4bc3-a063-5e3c9456d2f9-kube-api-access-kl52n" (OuterVolumeSpecName: "kube-api-access-kl52n") pod "3425377d-d147-4bc3-a063-5e3c9456d2f9" (UID: "3425377d-d147-4bc3-a063-5e3c9456d2f9"). InnerVolumeSpecName "kube-api-access-kl52n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.205760 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66354c96-340d-43c7-98d0-19713f857884-kube-api-access-jgplj" (OuterVolumeSpecName: "kube-api-access-jgplj") pod "66354c96-340d-43c7-98d0-19713f857884" (UID: "66354c96-340d-43c7-98d0-19713f857884"). InnerVolumeSpecName "kube-api-access-jgplj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.210000 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68547217-2996-4b25-9020-c2187ecfb42e" (UID: "68547217-2996-4b25-9020-c2187ecfb42e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.258212 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3425377d-d147-4bc3-a063-5e3c9456d2f9" (UID: "3425377d-d147-4bc3-a063-5e3c9456d2f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.288395 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.288417 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgplj\" (UniqueName: \"kubernetes.io/projected/66354c96-340d-43c7-98d0-19713f857884-kube-api-access-jgplj\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.288428 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.288437 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl52n\" (UniqueName: \"kubernetes.io/projected/3425377d-d147-4bc3-a063-5e3c9456d2f9-kube-api-access-kl52n\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.288444 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3425377d-d147-4bc3-a063-5e3c9456d2f9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.288452 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/68547217-2996-4b25-9020-c2187ecfb42e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.292643 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94374c20-25dd-491e-8778-0b00d10afda0" (UID: "94374c20-25dd-491e-8778-0b00d10afda0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.312249 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66354c96-340d-43c7-98d0-19713f857884" (UID: "66354c96-340d-43c7-98d0-19713f857884"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.381892 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" path="/var/lib/kubelet/pods/272b1940-458c-4610-a618-fcc2a4fab95c/volumes" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.382736 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" path="/var/lib/kubelet/pods/31427c94-1de1-431c-9f4b-e87f01548d3f/volumes" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.383690 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee21c95-556a-4a25-8b38-273489f881e4" path="/var/lib/kubelet/pods/fee21c95-556a-4a25-8b38-273489f881e4/volumes" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.403449 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/66354c96-340d-43c7-98d0-19713f857884-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.405097 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94374c20-25dd-491e-8778-0b00d10afda0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.571544 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.586450 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.616302 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-catalog-content\") pod \"d276a412-68f8-4069-9aa2-275fdb23997d\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.616406 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-utilities\") pod \"d276a412-68f8-4069-9aa2-275fdb23997d\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.616573 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5hmc\" (UniqueName: \"kubernetes.io/projected/d276a412-68f8-4069-9aa2-275fdb23997d-kube-api-access-l5hmc\") pod \"d276a412-68f8-4069-9aa2-275fdb23997d\" (UID: \"d276a412-68f8-4069-9aa2-275fdb23997d\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.617497 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-utilities" (OuterVolumeSpecName: "utilities") pod "d276a412-68f8-4069-9aa2-275fdb23997d" (UID: "d276a412-68f8-4069-9aa2-275fdb23997d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.620396 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.636291 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d276a412-68f8-4069-9aa2-275fdb23997d-kube-api-access-l5hmc" (OuterVolumeSpecName: "kube-api-access-l5hmc") pod "d276a412-68f8-4069-9aa2-275fdb23997d" (UID: "d276a412-68f8-4069-9aa2-275fdb23997d"). InnerVolumeSpecName "kube-api-access-l5hmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.654547 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5hmc\" (UniqueName: \"kubernetes.io/projected/d276a412-68f8-4069-9aa2-275fdb23997d-kube-api-access-l5hmc\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.654582 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.667667 4892 generic.go:334] "Generic (PLEG): container finished" podID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerID="0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7" exitCode=0 Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.667769 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ct6mr" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.667801 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerDied","Data":"0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.667879 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ct6mr" event={"ID":"50aef298-67b1-42cd-a4a5-86179693b0eb","Type":"ContainerDied","Data":"8631449bdce6b6b6fff6cb93d87f0be6618eef0dff8c5210aeee4a9d5dd4afb4"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.667901 4892 scope.go:117] "RemoveContainer" containerID="0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.676047 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxtbb" event={"ID":"3425377d-d147-4bc3-a063-5e3c9456d2f9","Type":"ContainerDied","Data":"2127c8729a6c593f3d6bebab6d244305f40ba0ecef8b5c7333a8e09dc89d1c5a"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.676153 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxtbb" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.689593 4892 generic.go:334] "Generic (PLEG): container finished" podID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerID="076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b" exitCode=0 Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.689671 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerDied","Data":"076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.689700 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4bn" event={"ID":"9833d581-8f46-4e58-9a99-6ec65dc9431b","Type":"ContainerDied","Data":"3681f4b0a6d8090419fd5ee1ad34a1d105dd1efdc4462f1ce4ec364c50b8b3cf"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.689859 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4bn" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.702195 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vvnr" event={"ID":"68547217-2996-4b25-9020-c2187ecfb42e","Type":"ContainerDied","Data":"e814ced2fb312d8217eec5ff46796c720b90d30cbd27b36cbfc3bcb0f60dde75"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.702304 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vvnr" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.717676 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbtj9" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.717801 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxtbb"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.717872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbtj9" event={"ID":"d276a412-68f8-4069-9aa2-275fdb23997d","Type":"ContainerDied","Data":"a409622bc7625257df8b70e7b1f49962ec72aa2700e539dda6fee1984e054289"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.726157 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d276a412-68f8-4069-9aa2-275fdb23997d" (UID: "d276a412-68f8-4069-9aa2-275fdb23997d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.735517 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jxtbb"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.737918 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f4q" event={"ID":"94374c20-25dd-491e-8778-0b00d10afda0","Type":"ContainerDied","Data":"5fed5f18e8e09f644c7feb02c6eba4a9a0e607e3b4689693b4baf8c49522ba35"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.738012 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f4q" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.740352 4892 scope.go:117] "RemoveContainer" containerID="f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.744066 4892 generic.go:334] "Generic (PLEG): container finished" podID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerID="098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af" exitCode=0 Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.744133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerDied","Data":"098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af"} Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.747875 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vvnr"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.755748 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-utilities\") pod \"50aef298-67b1-42cd-a4a5-86179693b0eb\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.755835 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-utilities\") pod \"9833d581-8f46-4e58-9a99-6ec65dc9431b\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.755937 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-catalog-content\") pod \"50aef298-67b1-42cd-a4a5-86179693b0eb\" (UID: 
\"50aef298-67b1-42cd-a4a5-86179693b0eb\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.755963 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-catalog-content\") pod \"9833d581-8f46-4e58-9a99-6ec65dc9431b\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.756065 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4bk\" (UniqueName: \"kubernetes.io/projected/50aef298-67b1-42cd-a4a5-86179693b0eb-kube-api-access-wz4bk\") pod \"50aef298-67b1-42cd-a4a5-86179693b0eb\" (UID: \"50aef298-67b1-42cd-a4a5-86179693b0eb\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.756133 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z49sj\" (UniqueName: \"kubernetes.io/projected/9833d581-8f46-4e58-9a99-6ec65dc9431b-kube-api-access-z49sj\") pod \"9833d581-8f46-4e58-9a99-6ec65dc9431b\" (UID: \"9833d581-8f46-4e58-9a99-6ec65dc9431b\") " Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.756660 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276a412-68f8-4069-9aa2-275fdb23997d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.759674 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-utilities" (OuterVolumeSpecName: "utilities") pod "9833d581-8f46-4e58-9a99-6ec65dc9431b" (UID: "9833d581-8f46-4e58-9a99-6ec65dc9431b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.760193 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-utilities" (OuterVolumeSpecName: "utilities") pod "50aef298-67b1-42cd-a4a5-86179693b0eb" (UID: "50aef298-67b1-42cd-a4a5-86179693b0eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.761871 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6vvnr"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.762133 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9833d581-8f46-4e58-9a99-6ec65dc9431b-kube-api-access-z49sj" (OuterVolumeSpecName: "kube-api-access-z49sj") pod "9833d581-8f46-4e58-9a99-6ec65dc9431b" (UID: "9833d581-8f46-4e58-9a99-6ec65dc9431b"). InnerVolumeSpecName "kube-api-access-z49sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.764234 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50aef298-67b1-42cd-a4a5-86179693b0eb-kube-api-access-wz4bk" (OuterVolumeSpecName: "kube-api-access-wz4bk") pod "50aef298-67b1-42cd-a4a5-86179693b0eb" (UID: "50aef298-67b1-42cd-a4a5-86179693b0eb"). InnerVolumeSpecName "kube-api-access-wz4bk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.778181 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af is running failed: container process not found" containerID="098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.778854 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af is running failed: container process not found" containerID="098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.779215 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgz4" event={"ID":"66354c96-340d-43c7-98d0-19713f857884","Type":"ContainerDied","Data":"93f38c7c7467fe902f1e67b205777619ca237bf0075010941b4ede85c2df88ac"} Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.779477 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af is running failed: container process not found" containerID="098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.779555 4892 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af is running 
failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-k2k8h" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="registry-server" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.779954 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgz4" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.794841 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x8f4q"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.810918 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x8f4q"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.848170 4892 scope.go:117] "RemoveContainer" containerID="5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.860170 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz4bk\" (UniqueName: \"kubernetes.io/projected/50aef298-67b1-42cd-a4a5-86179693b0eb-kube-api-access-wz4bk\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.860198 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z49sj\" (UniqueName: \"kubernetes.io/projected/9833d581-8f46-4e58-9a99-6ec65dc9431b-kube-api-access-z49sj\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.860209 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.860218 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 
crc kubenswrapper[4892]: I0217 19:50:33.879714 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztgz4"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.891391 4892 scope.go:117] "RemoveContainer" containerID="0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7" Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.891842 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7\": container with ID starting with 0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7 not found: ID does not exist" containerID="0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.891889 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7"} err="failed to get container status \"0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7\": rpc error: code = NotFound desc = could not find container \"0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7\": container with ID starting with 0cd96c59b712e93c256d47f7fb31c8154dbfed1cb5359729f1f4cb4df4f521d7 not found: ID does not exist" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.891918 4892 scope.go:117] "RemoveContainer" containerID="f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da" Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.892350 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da\": container with ID starting with f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da not found: ID does not exist" 
containerID="f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.892380 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da"} err="failed to get container status \"f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da\": rpc error: code = NotFound desc = could not find container \"f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da\": container with ID starting with f8bc707d6563a439c5213a575db578e3a964a66bd64169ac0c8c4e6e1a7044da not found: ID does not exist" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.892398 4892 scope.go:117] "RemoveContainer" containerID="5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a" Feb 17 19:50:33 crc kubenswrapper[4892]: E0217 19:50:33.893495 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a\": container with ID starting with 5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a not found: ID does not exist" containerID="5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.893517 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a"} err="failed to get container status \"5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a\": rpc error: code = NotFound desc = could not find container \"5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a\": container with ID starting with 5f5a3f59c2536565699b1b25c13d26697f1f39acfb6a67fda775b1a8d558ea5a not found: ID does not exist" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.893532 4892 scope.go:117] 
"RemoveContainer" containerID="296e009cfb6f2a21e4228283cfc44e00354c12e851eb69cdae67c43b2b299156" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.894269 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ztgz4"] Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.907100 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50aef298-67b1-42cd-a4a5-86179693b0eb" (UID: "50aef298-67b1-42cd-a4a5-86179693b0eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.915267 4892 scope.go:117] "RemoveContainer" containerID="17754112c2787e44e163a30f9110738ef9beb1dee8705a1644d1a377921c6503" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.945880 4892 scope.go:117] "RemoveContainer" containerID="67d1ec9b26c08b9dca9a5b81f44c80be7e8fa8e99899ac04645043c9c49500d2" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.966598 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50aef298-67b1-42cd-a4a5-86179693b0eb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:33 crc kubenswrapper[4892]: I0217 19:50:33.973368 4892 scope.go:117] "RemoveContainer" containerID="076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.026110 4892 scope.go:117] "RemoveContainer" containerID="bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.044918 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ct6mr"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.063983 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9833d581-8f46-4e58-9a99-6ec65dc9431b" (UID: "9833d581-8f46-4e58-9a99-6ec65dc9431b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.072772 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9833d581-8f46-4e58-9a99-6ec65dc9431b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.081470 4892 scope.go:117] "RemoveContainer" containerID="666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.083089 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ct6mr"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.100851 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbtj9"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.114557 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbtj9"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.225272 4892 scope.go:117] "RemoveContainer" containerID="076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b" Feb 17 19:50:34 crc kubenswrapper[4892]: E0217 19:50:34.226184 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b\": container with ID starting with 076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b not found: ID does not exist" containerID="076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.226228 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b"} err="failed to get container status \"076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b\": rpc error: code = NotFound desc = could not find container \"076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b\": container with ID starting with 076c41de3c27bee00540514cfa0ac6ff34e251fdec2660e974becb7d9c6b5c1b not found: ID does not exist" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.226252 4892 scope.go:117] "RemoveContainer" containerID="bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f" Feb 17 19:50:34 crc kubenswrapper[4892]: E0217 19:50:34.227071 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f\": container with ID starting with bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f not found: ID does not exist" containerID="bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.227097 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f"} err="failed to get container status \"bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f\": rpc error: code = NotFound desc = could not find container \"bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f\": container with ID starting with bd8b8a6ac9540f1ab8c9eaf080c8cddbf3dfa7c4efa13a1e1ab048216d12cd5f not found: ID does not exist" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.227113 4892 scope.go:117] "RemoveContainer" containerID="666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec" Feb 17 19:50:34 crc kubenswrapper[4892]: E0217 
19:50:34.228088 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec\": container with ID starting with 666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec not found: ID does not exist" containerID="666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.228113 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec"} err="failed to get container status \"666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec\": rpc error: code = NotFound desc = could not find container \"666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec\": container with ID starting with 666b68835bee66d4b6aa5d1e6aca7e51bc1e9c1455c012cafe58ed8db11bffec not found: ID does not exist" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.228125 4892 scope.go:117] "RemoveContainer" containerID="e51c1574a60189904fadbce2d623f904d53f6722685a80e95f6a12844fd34fa7" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.271452 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.287856 4892 scope.go:117] "RemoveContainer" containerID="168b7bdb369be71c31ac2c71505e1a44233545383d46375e567d36c7a7f38f6d" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.346853 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lx4bn"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.368678 4892 scope.go:117] "RemoveContainer" containerID="4377875f391f14d6766643a98ffd1dfef2c9c174e539553062ef8f25af15ca07" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.369267 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lx4bn"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.381929 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-utilities\") pod \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.381975 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-catalog-content\") pod \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.382126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc7rt\" (UniqueName: \"kubernetes.io/projected/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-kube-api-access-lc7rt\") pod \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\" (UID: \"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf\") " Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.396967 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-utilities" (OuterVolumeSpecName: "utilities") pod "e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" (UID: "e730a12c-43a9-45a7-a52b-9f6fd3eaecaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.399185 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-kube-api-access-lc7rt" (OuterVolumeSpecName: "kube-api-access-lc7rt") pod "e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" (UID: "e730a12c-43a9-45a7-a52b-9f6fd3eaecaf"). InnerVolumeSpecName "kube-api-access-lc7rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.483888 4892 scope.go:117] "RemoveContainer" containerID="f057e6826ee9b4451acb2ecf5775b8d00830c9d920d4c51c7a05610476ed7701" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.484998 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc7rt\" (UniqueName: \"kubernetes.io/projected/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-kube-api-access-lc7rt\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.485038 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.690569 4892 scope.go:117] "RemoveContainer" containerID="5186ffd59cb954c27883939ccd1e21943d600879ac8599b458c2142f6d2b66d3" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.749057 4892 scope.go:117] "RemoveContainer" containerID="aaf6c2d6daab549efc9c97fa98a8f39f4a98e15e2b90d63bf017d3878089637c" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.780079 4892 scope.go:117] "RemoveContainer" 
containerID="2ba90fc2d816105e9338e3455c3d172641daf2c4e534d92d6555ad37ec55401b" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.810260 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" (UID: "e730a12c-43a9-45a7-a52b-9f6fd3eaecaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.811039 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2k8h" event={"ID":"e730a12c-43a9-45a7-a52b-9f6fd3eaecaf","Type":"ContainerDied","Data":"1626b29e905340f6deacd2ba4712da7f61425d837a482ae74fd71298e9f9b9e5"} Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.811149 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2k8h" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.839109 4892 scope.go:117] "RemoveContainer" containerID="d1014d4ed3072d95e63d7b7bae47fe8dcd0e17bc688218b747f04f332329d99a" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.847704 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2k8h"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.861463 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k2k8h"] Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.895556 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.913701 4892 scope.go:117] "RemoveContainer" 
containerID="f10b38ed42395540b600837982e306e0f8547758920c7092c67abace4e1b4581" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.949646 4892 scope.go:117] "RemoveContainer" containerID="cc4815a035b4957f5c46b6adc05333fe70f215427c1b860521e216f31e27e5e6" Feb 17 19:50:34 crc kubenswrapper[4892]: I0217 19:50:34.976519 4892 scope.go:117] "RemoveContainer" containerID="009bd6851f021fef23a438ed46f023782908848880371c9dec7ee845540b639a" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.002176 4892 scope.go:117] "RemoveContainer" containerID="4139f53a7cc8fef9ebc26414bb1382f5d99f4b3d4c71987c97e854fd6f15b6e3" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.024202 4892 scope.go:117] "RemoveContainer" containerID="098479c84ca860a2b62aa81e3f7f2dc771b33b3e3dcdc59ed7bf5bd2efdd78af" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.049481 4892 scope.go:117] "RemoveContainer" containerID="8497fb9df9095ccf6b3defb12843d295646049056697a13f60fea815a49cd876" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.072515 4892 scope.go:117] "RemoveContainer" containerID="1ac8ae4b6ebbee6f0f846c3fe0d25eba908841a626fe85a5379294620d9d5406" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.372083 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" path="/var/lib/kubelet/pods/3425377d-d147-4bc3-a063-5e3c9456d2f9/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.373652 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" path="/var/lib/kubelet/pods/50aef298-67b1-42cd-a4a5-86179693b0eb/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.374446 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66354c96-340d-43c7-98d0-19713f857884" path="/var/lib/kubelet/pods/66354c96-340d-43c7-98d0-19713f857884/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.375557 4892 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="68547217-2996-4b25-9020-c2187ecfb42e" path="/var/lib/kubelet/pods/68547217-2996-4b25-9020-c2187ecfb42e/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.376265 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94374c20-25dd-491e-8778-0b00d10afda0" path="/var/lib/kubelet/pods/94374c20-25dd-491e-8778-0b00d10afda0/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.377401 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" path="/var/lib/kubelet/pods/9833d581-8f46-4e58-9a99-6ec65dc9431b/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.378474 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d276a412-68f8-4069-9aa2-275fdb23997d" path="/var/lib/kubelet/pods/d276a412-68f8-4069-9aa2-275fdb23997d/volumes" Feb 17 19:50:35 crc kubenswrapper[4892]: I0217 19:50:35.380068 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" path="/var/lib/kubelet/pods/e730a12c-43a9-45a7-a52b-9f6fd3eaecaf/volumes" Feb 17 19:50:37 crc kubenswrapper[4892]: I0217 19:50:37.424663 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:50:37 crc kubenswrapper[4892]: I0217 19:50:37.425166 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:51:07 crc kubenswrapper[4892]: I0217 19:51:07.424735 4892 patch_prober.go:28] 
interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:51:07 crc kubenswrapper[4892]: I0217 19:51:07.425316 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:51:08 crc kubenswrapper[4892]: I0217 19:51:08.707670 4892 scope.go:117] "RemoveContainer" containerID="e57bf64fd272b83c9b0c76454389d556e20f18a0cfda0881aa6288c49cb07d10" Feb 17 19:51:08 crc kubenswrapper[4892]: I0217 19:51:08.731912 4892 scope.go:117] "RemoveContainer" containerID="792b0f5ab045c2d981e9f022986d9afadc5e6fe9beeb3f3507e0e33fef329c3d" Feb 17 19:51:08 crc kubenswrapper[4892]: I0217 19:51:08.752975 4892 scope.go:117] "RemoveContainer" containerID="1c64371027f9a0b92aebbd32db4c5e47fb934474941206fbd2e67f5f92c2fedd" Feb 17 19:51:16 crc kubenswrapper[4892]: I0217 19:51:16.421667 4892 generic.go:334] "Generic (PLEG): container finished" podID="a33740f5-3f9c-4f60-a0b9-cc91fbed6701" containerID="caf9af17e5c66534251da1c497b54ce667f4f83d58ef685ba64adc875cc62790" exitCode=0 Feb 17 19:51:16 crc kubenswrapper[4892]: I0217 19:51:16.421952 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" event={"ID":"a33740f5-3f9c-4f60-a0b9-cc91fbed6701","Type":"ContainerDied","Data":"caf9af17e5c66534251da1c497b54ce667f4f83d58ef685ba64adc875cc62790"} Feb 17 19:51:17 crc kubenswrapper[4892]: I0217 19:51:17.868856 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:51:17 crc kubenswrapper[4892]: I0217 19:51:17.987423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ceph\") pod \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " Feb 17 19:51:17 crc kubenswrapper[4892]: I0217 19:51:17.987517 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-inventory\") pod \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " Feb 17 19:51:17 crc kubenswrapper[4892]: I0217 19:51:17.987545 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ssh-key-openstack-cell1\") pod \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " Feb 17 19:51:17 crc kubenswrapper[4892]: I0217 19:51:17.987562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9c87\" (UniqueName: \"kubernetes.io/projected/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-kube-api-access-x9c87\") pod \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\" (UID: \"a33740f5-3f9c-4f60-a0b9-cc91fbed6701\") " Feb 17 19:51:17 crc kubenswrapper[4892]: I0217 19:51:17.996107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ceph" (OuterVolumeSpecName: "ceph") pod "a33740f5-3f9c-4f60-a0b9-cc91fbed6701" (UID: "a33740f5-3f9c-4f60-a0b9-cc91fbed6701"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.010925 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-kube-api-access-x9c87" (OuterVolumeSpecName: "kube-api-access-x9c87") pod "a33740f5-3f9c-4f60-a0b9-cc91fbed6701" (UID: "a33740f5-3f9c-4f60-a0b9-cc91fbed6701"). InnerVolumeSpecName "kube-api-access-x9c87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.030058 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-inventory" (OuterVolumeSpecName: "inventory") pod "a33740f5-3f9c-4f60-a0b9-cc91fbed6701" (UID: "a33740f5-3f9c-4f60-a0b9-cc91fbed6701"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.030290 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a33740f5-3f9c-4f60-a0b9-cc91fbed6701" (UID: "a33740f5-3f9c-4f60-a0b9-cc91fbed6701"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.090958 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9c87\" (UniqueName: \"kubernetes.io/projected/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-kube-api-access-x9c87\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.093566 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.093778 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.093876 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a33740f5-3f9c-4f60-a0b9-cc91fbed6701-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.442602 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" event={"ID":"a33740f5-3f9c-4f60-a0b9-cc91fbed6701","Type":"ContainerDied","Data":"5c5b99bad09390cbf69c81c500cf54210aca6785247d0cfd89d3dde025cb08a6"} Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.442913 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c5b99bad09390cbf69c81c500cf54210aca6785247d0cfd89d3dde025cb08a6" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.442663 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-xqknj" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.540222 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-qpx7f"] Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.540977 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541011 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541048 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33740f5-3f9c-4f60-a0b9-cc91fbed6701" containerName="configure-os-openstack-openstack-cell1" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541061 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33740f5-3f9c-4f60-a0b9-cc91fbed6701" containerName="configure-os-openstack-openstack-cell1" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541083 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541094 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541117 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541128 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541151 4892 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541162 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541180 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541190 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541207 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541216 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541246 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541258 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541276 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541285 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541296 4892 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541306 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541328 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541338 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541351 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541360 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541387 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541397 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541421 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541431 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541452 4892 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541462 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541480 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541490 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541509 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541519 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541544 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541555 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541570 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541599 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541614 4892 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541624 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541649 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541660 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541675 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541684 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541912 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541923 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541940 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541949 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.541979 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.541989 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542011 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542020 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542038 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542050 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542065 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542074 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542096 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542105 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542135 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542144 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542161 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542171 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542183 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542193 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542210 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542220 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="extract-content" Feb 17 19:51:18 crc kubenswrapper[4892]: E0217 19:51:18.542258 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542268 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="extract-utilities" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542601 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d276a412-68f8-4069-9aa2-275fdb23997d" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542623 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="31427c94-1de1-431c-9f4b-e87f01548d3f" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542636 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e730a12c-43a9-45a7-a52b-9f6fd3eaecaf" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542650 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="66354c96-340d-43c7-98d0-19713f857884" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542667 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="272b1940-458c-4610-a618-fcc2a4fab95c" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542689 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="94374c20-25dd-491e-8778-0b00d10afda0" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542710 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9833d581-8f46-4e58-9a99-6ec65dc9431b" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542732 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee21c95-556a-4a25-8b38-273489f881e4" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542757 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33740f5-3f9c-4f60-a0b9-cc91fbed6701" containerName="configure-os-openstack-openstack-cell1" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542773 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3425377d-d147-4bc3-a063-5e3c9456d2f9" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542790 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="68547217-2996-4b25-9020-c2187ecfb42e" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.542809 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="50aef298-67b1-42cd-a4a5-86179693b0eb" containerName="registry-server" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.543920 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.545690 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.546395 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.547227 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.549110 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.552239 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qpx7f"] Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.602773 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.603018 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" 
(UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-inventory-0\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.603171 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqggv\" (UniqueName: \"kubernetes.io/projected/ff891110-801e-4168-831b-d018dff2a1e5-kube-api-access-jqggv\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.603234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ceph\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.705890 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqggv\" (UniqueName: \"kubernetes.io/projected/ff891110-801e-4168-831b-d018dff2a1e5-kube-api-access-jqggv\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.705985 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ceph\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.706035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.707043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-inventory-0\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.710063 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.710345 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-inventory-0\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.716499 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ceph\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.733646 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqggv\" (UniqueName: 
\"kubernetes.io/projected/ff891110-801e-4168-831b-d018dff2a1e5-kube-api-access-jqggv\") pod \"ssh-known-hosts-openstack-qpx7f\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:18 crc kubenswrapper[4892]: I0217 19:51:18.877541 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:19 crc kubenswrapper[4892]: I0217 19:51:19.465171 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qpx7f"] Feb 17 19:51:19 crc kubenswrapper[4892]: W0217 19:51:19.469624 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff891110_801e_4168_831b_d018dff2a1e5.slice/crio-f10f983a40f119e5188077b2fa4680982766a07d4d2439ac602cd7813461db84 WatchSource:0}: Error finding container f10f983a40f119e5188077b2fa4680982766a07d4d2439ac602cd7813461db84: Status 404 returned error can't find the container with id f10f983a40f119e5188077b2fa4680982766a07d4d2439ac602cd7813461db84 Feb 17 19:51:20 crc kubenswrapper[4892]: I0217 19:51:20.471429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qpx7f" event={"ID":"ff891110-801e-4168-831b-d018dff2a1e5","Type":"ContainerStarted","Data":"7b6e9c688d60feb133f19a3661caec7cafe648c40ddbb44828196b028d31a3b3"} Feb 17 19:51:20 crc kubenswrapper[4892]: I0217 19:51:20.471688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qpx7f" event={"ID":"ff891110-801e-4168-831b-d018dff2a1e5","Type":"ContainerStarted","Data":"f10f983a40f119e5188077b2fa4680982766a07d4d2439ac602cd7813461db84"} Feb 17 19:51:20 crc kubenswrapper[4892]: I0217 19:51:20.504468 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-qpx7f" podStartSLOduration=2.117269998 
podStartE2EDuration="2.504433133s" podCreationTimestamp="2026-02-17 19:51:18 +0000 UTC" firstStartedPulling="2026-02-17 19:51:19.473398402 +0000 UTC m=+7650.848801677" lastFinishedPulling="2026-02-17 19:51:19.860561537 +0000 UTC m=+7651.235964812" observedRunningTime="2026-02-17 19:51:20.493160079 +0000 UTC m=+7651.868563394" watchObservedRunningTime="2026-02-17 19:51:20.504433133 +0000 UTC m=+7651.879836458" Feb 17 19:51:29 crc kubenswrapper[4892]: I0217 19:51:29.585151 4892 generic.go:334] "Generic (PLEG): container finished" podID="ff891110-801e-4168-831b-d018dff2a1e5" containerID="7b6e9c688d60feb133f19a3661caec7cafe648c40ddbb44828196b028d31a3b3" exitCode=0 Feb 17 19:51:29 crc kubenswrapper[4892]: I0217 19:51:29.585305 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qpx7f" event={"ID":"ff891110-801e-4168-831b-d018dff2a1e5","Type":"ContainerDied","Data":"7b6e9c688d60feb133f19a3661caec7cafe648c40ddbb44828196b028d31a3b3"} Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.093424 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.216126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqggv\" (UniqueName: \"kubernetes.io/projected/ff891110-801e-4168-831b-d018dff2a1e5-kube-api-access-jqggv\") pod \"ff891110-801e-4168-831b-d018dff2a1e5\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.216587 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-inventory-0\") pod \"ff891110-801e-4168-831b-d018dff2a1e5\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.216705 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ceph\") pod \"ff891110-801e-4168-831b-d018dff2a1e5\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.216783 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ssh-key-openstack-cell1\") pod \"ff891110-801e-4168-831b-d018dff2a1e5\" (UID: \"ff891110-801e-4168-831b-d018dff2a1e5\") " Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.222869 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff891110-801e-4168-831b-d018dff2a1e5-kube-api-access-jqggv" (OuterVolumeSpecName: "kube-api-access-jqggv") pod "ff891110-801e-4168-831b-d018dff2a1e5" (UID: "ff891110-801e-4168-831b-d018dff2a1e5"). InnerVolumeSpecName "kube-api-access-jqggv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.223966 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ceph" (OuterVolumeSpecName: "ceph") pod "ff891110-801e-4168-831b-d018dff2a1e5" (UID: "ff891110-801e-4168-831b-d018dff2a1e5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.249737 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ff891110-801e-4168-831b-d018dff2a1e5" (UID: "ff891110-801e-4168-831b-d018dff2a1e5"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.256054 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ff891110-801e-4168-831b-d018dff2a1e5" (UID: "ff891110-801e-4168-831b-d018dff2a1e5"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.320209 4892 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.320253 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.320266 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff891110-801e-4168-831b-d018dff2a1e5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.320284 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqggv\" (UniqueName: \"kubernetes.io/projected/ff891110-801e-4168-831b-d018dff2a1e5-kube-api-access-jqggv\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.611006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qpx7f" event={"ID":"ff891110-801e-4168-831b-d018dff2a1e5","Type":"ContainerDied","Data":"f10f983a40f119e5188077b2fa4680982766a07d4d2439ac602cd7813461db84"} Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.611068 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f10f983a40f119e5188077b2fa4680982766a07d4d2439ac602cd7813461db84" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.611083 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qpx7f" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.707802 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-ww4hh"] Feb 17 19:51:31 crc kubenswrapper[4892]: E0217 19:51:31.708569 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff891110-801e-4168-831b-d018dff2a1e5" containerName="ssh-known-hosts-openstack" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.708605 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff891110-801e-4168-831b-d018dff2a1e5" containerName="ssh-known-hosts-openstack" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.709123 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff891110-801e-4168-831b-d018dff2a1e5" containerName="ssh-known-hosts-openstack" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.710448 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.713951 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.714285 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.716436 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.724796 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-ww4hh"] Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.727399 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 
19:51:31.834467 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ceph\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.834587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jxj\" (UniqueName: \"kubernetes.io/projected/ef5efbdd-488e-4956-b442-da3dfc5542e1-kube-api-access-84jxj\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.835008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.835302 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-inventory\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.937998 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: 
\"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.938239 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-inventory\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.939000 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ceph\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.939325 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jxj\" (UniqueName: \"kubernetes.io/projected/ef5efbdd-488e-4956-b442-da3dfc5542e1-kube-api-access-84jxj\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.943116 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-inventory\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.943498 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ceph\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: 
\"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.947392 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:31 crc kubenswrapper[4892]: I0217 19:51:31.961242 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jxj\" (UniqueName: \"kubernetes.io/projected/ef5efbdd-488e-4956-b442-da3dfc5542e1-kube-api-access-84jxj\") pod \"run-os-openstack-openstack-cell1-ww4hh\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:32 crc kubenswrapper[4892]: I0217 19:51:32.045095 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:32 crc kubenswrapper[4892]: I0217 19:51:32.715581 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-ww4hh"] Feb 17 19:51:33 crc kubenswrapper[4892]: I0217 19:51:33.659768 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" event={"ID":"ef5efbdd-488e-4956-b442-da3dfc5542e1","Type":"ContainerStarted","Data":"062b0a48b661b75fc675067623a5202987f0b22903ea6a34171513c25571aa6d"} Feb 17 19:51:33 crc kubenswrapper[4892]: I0217 19:51:33.660289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" event={"ID":"ef5efbdd-488e-4956-b442-da3dfc5542e1","Type":"ContainerStarted","Data":"1a9c618b89d8b44db2401c2cecce04184b3820e4d7be1e28affdf4065dd38281"} Feb 17 19:51:33 crc kubenswrapper[4892]: I0217 19:51:33.689258 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" podStartSLOduration=2.188111993 podStartE2EDuration="2.689238075s" podCreationTimestamp="2026-02-17 19:51:31 +0000 UTC" firstStartedPulling="2026-02-17 19:51:32.716678664 +0000 UTC m=+7664.092081939" lastFinishedPulling="2026-02-17 19:51:33.217804746 +0000 UTC m=+7664.593208021" observedRunningTime="2026-02-17 19:51:33.679509823 +0000 UTC m=+7665.054913108" watchObservedRunningTime="2026-02-17 19:51:33.689238075 +0000 UTC m=+7665.064641350" Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.424233 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.424709 4892 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.424749 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.425892 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.425947 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a" gracePeriod=600 Feb 17 19:51:37 crc kubenswrapper[4892]: E0217 19:51:37.574085 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9013d62_9809_436b_82a8_5b18dbf13e35.slice/crio-conmon-3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a.scope\": RecentStats: unable to find data in memory cache]" Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.706595 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a" 
exitCode=0 Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.706693 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a"} Feb 17 19:51:37 crc kubenswrapper[4892]: I0217 19:51:37.706975 4892 scope.go:117] "RemoveContainer" containerID="445f87c5a931a69b03d0a40bb0f8702e093637c8856522e51b78c9e6dbbc9e48" Feb 17 19:51:38 crc kubenswrapper[4892]: I0217 19:51:38.718933 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35"} Feb 17 19:51:42 crc kubenswrapper[4892]: I0217 19:51:42.809724 4892 generic.go:334] "Generic (PLEG): container finished" podID="ef5efbdd-488e-4956-b442-da3dfc5542e1" containerID="062b0a48b661b75fc675067623a5202987f0b22903ea6a34171513c25571aa6d" exitCode=0 Feb 17 19:51:42 crc kubenswrapper[4892]: I0217 19:51:42.809970 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" event={"ID":"ef5efbdd-488e-4956-b442-da3dfc5542e1","Type":"ContainerDied","Data":"062b0a48b661b75fc675067623a5202987f0b22903ea6a34171513c25571aa6d"} Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.309640 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.355266 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84jxj\" (UniqueName: \"kubernetes.io/projected/ef5efbdd-488e-4956-b442-da3dfc5542e1-kube-api-access-84jxj\") pod \"ef5efbdd-488e-4956-b442-da3dfc5542e1\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.356070 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ssh-key-openstack-cell1\") pod \"ef5efbdd-488e-4956-b442-da3dfc5542e1\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.356115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-inventory\") pod \"ef5efbdd-488e-4956-b442-da3dfc5542e1\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.356144 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ceph\") pod \"ef5efbdd-488e-4956-b442-da3dfc5542e1\" (UID: \"ef5efbdd-488e-4956-b442-da3dfc5542e1\") " Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.361387 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ceph" (OuterVolumeSpecName: "ceph") pod "ef5efbdd-488e-4956-b442-da3dfc5542e1" (UID: "ef5efbdd-488e-4956-b442-da3dfc5542e1"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.361801 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5efbdd-488e-4956-b442-da3dfc5542e1-kube-api-access-84jxj" (OuterVolumeSpecName: "kube-api-access-84jxj") pod "ef5efbdd-488e-4956-b442-da3dfc5542e1" (UID: "ef5efbdd-488e-4956-b442-da3dfc5542e1"). InnerVolumeSpecName "kube-api-access-84jxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.393615 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ef5efbdd-488e-4956-b442-da3dfc5542e1" (UID: "ef5efbdd-488e-4956-b442-da3dfc5542e1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.397483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-inventory" (OuterVolumeSpecName: "inventory") pod "ef5efbdd-488e-4956-b442-da3dfc5542e1" (UID: "ef5efbdd-488e-4956-b442-da3dfc5542e1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.458991 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84jxj\" (UniqueName: \"kubernetes.io/projected/ef5efbdd-488e-4956-b442-da3dfc5542e1-kube-api-access-84jxj\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.459141 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.459186 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.459198 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef5efbdd-488e-4956-b442-da3dfc5542e1-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.838263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" event={"ID":"ef5efbdd-488e-4956-b442-da3dfc5542e1","Type":"ContainerDied","Data":"1a9c618b89d8b44db2401c2cecce04184b3820e4d7be1e28affdf4065dd38281"} Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.838311 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a9c618b89d8b44db2401c2cecce04184b3820e4d7be1e28affdf4065dd38281" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.838309 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ww4hh" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.918855 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-tb4ks"] Feb 17 19:51:44 crc kubenswrapper[4892]: E0217 19:51:44.919647 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5efbdd-488e-4956-b442-da3dfc5542e1" containerName="run-os-openstack-openstack-cell1" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.919683 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5efbdd-488e-4956-b442-da3dfc5542e1" containerName="run-os-openstack-openstack-cell1" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.920075 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5efbdd-488e-4956-b442-da3dfc5542e1" containerName="run-os-openstack-openstack-cell1" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.921448 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.923897 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.924716 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.924732 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.930455 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-tb4ks"] Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.935598 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.969136 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mmwd\" (UniqueName: \"kubernetes.io/projected/430ead3e-b98d-466b-819e-3231507f95cd-kube-api-access-4mmwd\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.969221 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-inventory\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.969268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ceph\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:44 crc kubenswrapper[4892]: I0217 19:51:44.969294 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.071547 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mmwd\" (UniqueName: \"kubernetes.io/projected/430ead3e-b98d-466b-819e-3231507f95cd-kube-api-access-4mmwd\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.071604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-inventory\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.071630 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ceph\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.071650 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.075449 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-inventory\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.075508 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.076037 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ceph\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.086026 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mmwd\" (UniqueName: \"kubernetes.io/projected/430ead3e-b98d-466b-819e-3231507f95cd-kube-api-access-4mmwd\") pod \"reboot-os-openstack-openstack-cell1-tb4ks\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 
crc kubenswrapper[4892]: I0217 19:51:45.250326 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.805988 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-tb4ks"] Feb 17 19:51:45 crc kubenswrapper[4892]: I0217 19:51:45.856878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" event={"ID":"430ead3e-b98d-466b-819e-3231507f95cd","Type":"ContainerStarted","Data":"6ebc39c4dac4802c4a41aa3c0e453e2b5384bde83fdda7b475c14cbf0460c768"} Feb 17 19:51:46 crc kubenswrapper[4892]: I0217 19:51:46.875193 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" event={"ID":"430ead3e-b98d-466b-819e-3231507f95cd","Type":"ContainerStarted","Data":"4b22847b701dc69a526e898c5a8f55f513d747bed91a5dcc8441c7f875564a28"} Feb 17 19:51:46 crc kubenswrapper[4892]: I0217 19:51:46.911146 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" podStartSLOduration=2.522103645 podStartE2EDuration="2.911121019s" podCreationTimestamp="2026-02-17 19:51:44 +0000 UTC" firstStartedPulling="2026-02-17 19:51:45.812402892 +0000 UTC m=+7677.187806197" lastFinishedPulling="2026-02-17 19:51:46.201420296 +0000 UTC m=+7677.576823571" observedRunningTime="2026-02-17 19:51:46.901590701 +0000 UTC m=+7678.276994066" watchObservedRunningTime="2026-02-17 19:51:46.911121019 +0000 UTC m=+7678.286524294" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.814542 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ztjb7"] Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.818222 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.828684 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ztjb7"] Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.880595 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx75k\" (UniqueName: \"kubernetes.io/projected/5c058476-2e9f-42eb-b8f1-377820b00c85-kube-api-access-gx75k\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.880997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-utilities\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.881138 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-catalog-content\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.983014 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx75k\" (UniqueName: \"kubernetes.io/projected/5c058476-2e9f-42eb-b8f1-377820b00c85-kube-api-access-gx75k\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.983153 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-utilities\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.983410 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-catalog-content\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.983986 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-utilities\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:00 crc kubenswrapper[4892]: I0217 19:52:00.984041 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-catalog-content\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:01 crc kubenswrapper[4892]: I0217 19:52:01.006336 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx75k\" (UniqueName: \"kubernetes.io/projected/5c058476-2e9f-42eb-b8f1-377820b00c85-kube-api-access-gx75k\") pod \"community-operators-ztjb7\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:01 crc kubenswrapper[4892]: I0217 19:52:01.154403 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:01 crc kubenswrapper[4892]: I0217 19:52:01.750912 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ztjb7"] Feb 17 19:52:01 crc kubenswrapper[4892]: W0217 19:52:01.766109 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c058476_2e9f_42eb_b8f1_377820b00c85.slice/crio-3c00f2c238fbb0fca993fccb96df0d3da52817ef88f29c4f6ba5d862c407d266 WatchSource:0}: Error finding container 3c00f2c238fbb0fca993fccb96df0d3da52817ef88f29c4f6ba5d862c407d266: Status 404 returned error can't find the container with id 3c00f2c238fbb0fca993fccb96df0d3da52817ef88f29c4f6ba5d862c407d266 Feb 17 19:52:02 crc kubenswrapper[4892]: I0217 19:52:02.044130 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerID="f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456" exitCode=0 Feb 17 19:52:02 crc kubenswrapper[4892]: I0217 19:52:02.044350 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerDied","Data":"f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456"} Feb 17 19:52:02 crc kubenswrapper[4892]: I0217 19:52:02.044478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerStarted","Data":"3c00f2c238fbb0fca993fccb96df0d3da52817ef88f29c4f6ba5d862c407d266"} Feb 17 19:52:03 crc kubenswrapper[4892]: I0217 19:52:03.055231 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" 
event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerStarted","Data":"1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb"} Feb 17 19:52:03 crc kubenswrapper[4892]: I0217 19:52:03.057087 4892 generic.go:334] "Generic (PLEG): container finished" podID="430ead3e-b98d-466b-819e-3231507f95cd" containerID="4b22847b701dc69a526e898c5a8f55f513d747bed91a5dcc8441c7f875564a28" exitCode=0 Feb 17 19:52:03 crc kubenswrapper[4892]: I0217 19:52:03.057133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" event={"ID":"430ead3e-b98d-466b-819e-3231507f95cd","Type":"ContainerDied","Data":"4b22847b701dc69a526e898c5a8f55f513d747bed91a5dcc8441c7f875564a28"} Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.072320 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerID="1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb" exitCode=0 Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.072423 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerDied","Data":"1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb"} Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.617464 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.676634 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ssh-key-openstack-cell1\") pod \"430ead3e-b98d-466b-819e-3231507f95cd\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.676793 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-inventory\") pod \"430ead3e-b98d-466b-819e-3231507f95cd\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.677039 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mmwd\" (UniqueName: \"kubernetes.io/projected/430ead3e-b98d-466b-819e-3231507f95cd-kube-api-access-4mmwd\") pod \"430ead3e-b98d-466b-819e-3231507f95cd\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.677070 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ceph\") pod \"430ead3e-b98d-466b-819e-3231507f95cd\" (UID: \"430ead3e-b98d-466b-819e-3231507f95cd\") " Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.689847 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ceph" (OuterVolumeSpecName: "ceph") pod "430ead3e-b98d-466b-819e-3231507f95cd" (UID: "430ead3e-b98d-466b-819e-3231507f95cd"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.689896 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430ead3e-b98d-466b-819e-3231507f95cd-kube-api-access-4mmwd" (OuterVolumeSpecName: "kube-api-access-4mmwd") pod "430ead3e-b98d-466b-819e-3231507f95cd" (UID: "430ead3e-b98d-466b-819e-3231507f95cd"). InnerVolumeSpecName "kube-api-access-4mmwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.733279 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-inventory" (OuterVolumeSpecName: "inventory") pod "430ead3e-b98d-466b-819e-3231507f95cd" (UID: "430ead3e-b98d-466b-819e-3231507f95cd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.735006 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "430ead3e-b98d-466b-819e-3231507f95cd" (UID: "430ead3e-b98d-466b-819e-3231507f95cd"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.780115 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mmwd\" (UniqueName: \"kubernetes.io/projected/430ead3e-b98d-466b-819e-3231507f95cd-kube-api-access-4mmwd\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.780160 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.780170 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:04 crc kubenswrapper[4892]: I0217 19:52:04.780180 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/430ead3e-b98d-466b-819e-3231507f95cd-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.082878 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerStarted","Data":"f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8"} Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.085585 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" event={"ID":"430ead3e-b98d-466b-819e-3231507f95cd","Type":"ContainerDied","Data":"6ebc39c4dac4802c4a41aa3c0e453e2b5384bde83fdda7b475c14cbf0460c768"} Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.085617 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-tb4ks" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.085621 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ebc39c4dac4802c4a41aa3c0e453e2b5384bde83fdda7b475c14cbf0460c768" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.133051 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ztjb7" podStartSLOduration=2.597422588 podStartE2EDuration="5.133029857s" podCreationTimestamp="2026-02-17 19:52:00 +0000 UTC" firstStartedPulling="2026-02-17 19:52:02.04567434 +0000 UTC m=+7693.421077605" lastFinishedPulling="2026-02-17 19:52:04.581281579 +0000 UTC m=+7695.956684874" observedRunningTime="2026-02-17 19:52:05.10204032 +0000 UTC m=+7696.477443585" watchObservedRunningTime="2026-02-17 19:52:05.133029857 +0000 UTC m=+7696.508433132" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.217072 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-klzmt"] Feb 17 19:52:05 crc kubenswrapper[4892]: E0217 19:52:05.217695 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430ead3e-b98d-466b-819e-3231507f95cd" containerName="reboot-os-openstack-openstack-cell1" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.217714 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="430ead3e-b98d-466b-819e-3231507f95cd" containerName="reboot-os-openstack-openstack-cell1" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.217980 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="430ead3e-b98d-466b-819e-3231507f95cd" containerName="reboot-os-openstack-openstack-cell1" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.218809 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.224328 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.224389 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.224549 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.224603 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.235682 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-klzmt"] Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290195 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-inventory\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290327 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290375 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290464 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s84hc\" (UniqueName: \"kubernetes.io/projected/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-kube-api-access-s84hc\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290517 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290540 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290585 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290619 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290668 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ceph\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290694 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290708 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: 
\"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.290744 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.392805 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.393087 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.393939 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s84hc\" (UniqueName: \"kubernetes.io/projected/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-kube-api-access-s84hc\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.394940 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.396188 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.396543 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.396895 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.397587 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ceph\") pod 
\"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.397890 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.401254 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.401164 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.401304 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.400702 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.401448 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.402000 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.402009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.402221 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ovn-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.402677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-inventory\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.402825 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.403487 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ceph\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.405859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.405910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.410317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s84hc\" (UniqueName: \"kubernetes.io/projected/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-kube-api-access-s84hc\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.410950 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-inventory\") pod \"install-certs-openstack-openstack-cell1-klzmt\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:05 crc kubenswrapper[4892]: I0217 19:52:05.550005 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:06 crc kubenswrapper[4892]: W0217 19:52:06.194612 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb317ef_c3b7_41a8_b80b_de001a95dbd1.slice/crio-4381b30bff070b2d42b138bc73132c616cbc9f41f4ce0f6a43159897a05ef91c WatchSource:0}: Error finding container 4381b30bff070b2d42b138bc73132c616cbc9f41f4ce0f6a43159897a05ef91c: Status 404 returned error can't find the container with id 4381b30bff070b2d42b138bc73132c616cbc9f41f4ce0f6a43159897a05ef91c Feb 17 19:52:06 crc kubenswrapper[4892]: I0217 19:52:06.203122 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-klzmt"] Feb 17 19:52:07 crc kubenswrapper[4892]: I0217 19:52:07.108490 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" event={"ID":"cdb317ef-c3b7-41a8-b80b-de001a95dbd1","Type":"ContainerStarted","Data":"4381b30bff070b2d42b138bc73132c616cbc9f41f4ce0f6a43159897a05ef91c"} Feb 17 19:52:08 crc kubenswrapper[4892]: I0217 19:52:08.121799 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" event={"ID":"cdb317ef-c3b7-41a8-b80b-de001a95dbd1","Type":"ContainerStarted","Data":"87c9a4af52edf19692f49b91f1c66ec1217bdbfcf1eabf32c43895bcb16677a0"} Feb 17 19:52:08 crc kubenswrapper[4892]: I0217 19:52:08.153666 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" podStartSLOduration=2.460838794 podStartE2EDuration="3.153644842s" podCreationTimestamp="2026-02-17 19:52:05 +0000 UTC" firstStartedPulling="2026-02-17 19:52:06.197254944 +0000 UTC m=+7697.572658209" lastFinishedPulling="2026-02-17 19:52:06.890060992 +0000 UTC m=+7698.265464257" observedRunningTime="2026-02-17 
19:52:08.147561858 +0000 UTC m=+7699.522965123" watchObservedRunningTime="2026-02-17 19:52:08.153644842 +0000 UTC m=+7699.529048107" Feb 17 19:52:11 crc kubenswrapper[4892]: I0217 19:52:11.155198 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:11 crc kubenswrapper[4892]: I0217 19:52:11.156069 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:11 crc kubenswrapper[4892]: I0217 19:52:11.214951 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:12 crc kubenswrapper[4892]: I0217 19:52:12.239629 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:12 crc kubenswrapper[4892]: I0217 19:52:12.310409 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ztjb7"] Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.205533 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ztjb7" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="registry-server" containerID="cri-o://f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8" gracePeriod=2 Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.717025 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.820088 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-utilities\") pod \"5c058476-2e9f-42eb-b8f1-377820b00c85\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.820219 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx75k\" (UniqueName: \"kubernetes.io/projected/5c058476-2e9f-42eb-b8f1-377820b00c85-kube-api-access-gx75k\") pod \"5c058476-2e9f-42eb-b8f1-377820b00c85\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.820272 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-catalog-content\") pod \"5c058476-2e9f-42eb-b8f1-377820b00c85\" (UID: \"5c058476-2e9f-42eb-b8f1-377820b00c85\") " Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.820806 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-utilities" (OuterVolumeSpecName: "utilities") pod "5c058476-2e9f-42eb-b8f1-377820b00c85" (UID: "5c058476-2e9f-42eb-b8f1-377820b00c85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.830702 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c058476-2e9f-42eb-b8f1-377820b00c85-kube-api-access-gx75k" (OuterVolumeSpecName: "kube-api-access-gx75k") pod "5c058476-2e9f-42eb-b8f1-377820b00c85" (UID: "5c058476-2e9f-42eb-b8f1-377820b00c85"). InnerVolumeSpecName "kube-api-access-gx75k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.873762 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c058476-2e9f-42eb-b8f1-377820b00c85" (UID: "5c058476-2e9f-42eb-b8f1-377820b00c85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.922726 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.922757 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c058476-2e9f-42eb-b8f1-377820b00c85-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:14 crc kubenswrapper[4892]: I0217 19:52:14.922768 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx75k\" (UniqueName: \"kubernetes.io/projected/5c058476-2e9f-42eb-b8f1-377820b00c85-kube-api-access-gx75k\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.232474 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerID="f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8" exitCode=0 Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.232752 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ztjb7" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.232776 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerDied","Data":"f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8"} Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.235053 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ztjb7" event={"ID":"5c058476-2e9f-42eb-b8f1-377820b00c85","Type":"ContainerDied","Data":"3c00f2c238fbb0fca993fccb96df0d3da52817ef88f29c4f6ba5d862c407d266"} Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.235135 4892 scope.go:117] "RemoveContainer" containerID="f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.267949 4892 scope.go:117] "RemoveContainer" containerID="1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.281233 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ztjb7"] Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.294264 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ztjb7"] Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.298986 4892 scope.go:117] "RemoveContainer" containerID="f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.376744 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" path="/var/lib/kubelet/pods/5c058476-2e9f-42eb-b8f1-377820b00c85/volumes" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.384545 4892 scope.go:117] "RemoveContainer" 
containerID="f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8" Feb 17 19:52:15 crc kubenswrapper[4892]: E0217 19:52:15.384968 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8\": container with ID starting with f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8 not found: ID does not exist" containerID="f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.385031 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8"} err="failed to get container status \"f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8\": rpc error: code = NotFound desc = could not find container \"f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8\": container with ID starting with f9b85f0afc43e5dbc42f0cceb1fc7a3b649ff7fa1ac923daebd89f6237880cd8 not found: ID does not exist" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.385064 4892 scope.go:117] "RemoveContainer" containerID="1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb" Feb 17 19:52:15 crc kubenswrapper[4892]: E0217 19:52:15.385499 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb\": container with ID starting with 1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb not found: ID does not exist" containerID="1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.385530 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb"} err="failed to get container status \"1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb\": rpc error: code = NotFound desc = could not find container \"1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb\": container with ID starting with 1ac2737ba52a6bdf66cdb2b130e43b7b2569aaefa285d0f25fe95cfcab2e2cfb not found: ID does not exist" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.385555 4892 scope.go:117] "RemoveContainer" containerID="f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456" Feb 17 19:52:15 crc kubenswrapper[4892]: E0217 19:52:15.385849 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456\": container with ID starting with f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456 not found: ID does not exist" containerID="f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456" Feb 17 19:52:15 crc kubenswrapper[4892]: I0217 19:52:15.385889 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456"} err="failed to get container status \"f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456\": rpc error: code = NotFound desc = could not find container \"f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456\": container with ID starting with f0b77945d02f06fbb234add2cf1d7a5f5d173416781fc35b2a61715b4fd11456 not found: ID does not exist" Feb 17 19:52:26 crc kubenswrapper[4892]: I0217 19:52:26.363689 4892 generic.go:334] "Generic (PLEG): container finished" podID="cdb317ef-c3b7-41a8-b80b-de001a95dbd1" containerID="87c9a4af52edf19692f49b91f1c66ec1217bdbfcf1eabf32c43895bcb16677a0" exitCode=0 Feb 17 19:52:26 crc kubenswrapper[4892]: 
I0217 19:52:26.363809 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" event={"ID":"cdb317ef-c3b7-41a8-b80b-de001a95dbd1","Type":"ContainerDied","Data":"87c9a4af52edf19692f49b91f1c66ec1217bdbfcf1eabf32c43895bcb16677a0"} Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.851343 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.944785 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ssh-key-openstack-cell1\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.944925 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ovn-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.944989 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-inventory\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945112 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-telemetry-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 
19:52:27.945176 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-nova-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945230 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-metadata-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945287 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-sriov-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945320 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-bootstrap-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945372 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s84hc\" (UniqueName: \"kubernetes.io/projected/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-kube-api-access-s84hc\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945417 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-libvirt-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945443 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ceph\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.945545 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-dhcp-combined-ca-bundle\") pod \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\" (UID: \"cdb317ef-c3b7-41a8-b80b-de001a95dbd1\") " Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.953699 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.954036 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.954245 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.954502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.955176 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.955420 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.955713 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.955915 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.956902 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ceph" (OuterVolumeSpecName: "ceph") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.969354 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-kube-api-access-s84hc" (OuterVolumeSpecName: "kube-api-access-s84hc") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "kube-api-access-s84hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:52:27 crc kubenswrapper[4892]: I0217 19:52:27.997979 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-inventory" (OuterVolumeSpecName: "inventory") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.010720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cdb317ef-c3b7-41a8-b80b-de001a95dbd1" (UID: "cdb317ef-c3b7-41a8-b80b-de001a95dbd1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048508 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s84hc\" (UniqueName: \"kubernetes.io/projected/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-kube-api-access-s84hc\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048540 4892 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048550 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048560 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048571 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048581 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048589 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048597 4892 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048605 4892 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048613 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048622 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.048633 4892 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb317ef-c3b7-41a8-b80b-de001a95dbd1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.395122 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" event={"ID":"cdb317ef-c3b7-41a8-b80b-de001a95dbd1","Type":"ContainerDied","Data":"4381b30bff070b2d42b138bc73132c616cbc9f41f4ce0f6a43159897a05ef91c"} Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.395436 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4381b30bff070b2d42b138bc73132c616cbc9f41f4ce0f6a43159897a05ef91c" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.395520 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-klzmt" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.495508 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-f8r4q"] Feb 17 19:52:28 crc kubenswrapper[4892]: E0217 19:52:28.496085 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb317ef-c3b7-41a8-b80b-de001a95dbd1" containerName="install-certs-openstack-openstack-cell1" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.496110 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb317ef-c3b7-41a8-b80b-de001a95dbd1" containerName="install-certs-openstack-openstack-cell1" Feb 17 19:52:28 crc kubenswrapper[4892]: E0217 19:52:28.496136 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="registry-server" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.496145 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="registry-server" Feb 17 19:52:28 crc kubenswrapper[4892]: E0217 19:52:28.496199 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="extract-utilities" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.496208 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="extract-utilities" Feb 17 19:52:28 crc kubenswrapper[4892]: E0217 19:52:28.496233 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="extract-content" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.496242 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="extract-content" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.496586 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cdb317ef-c3b7-41a8-b80b-de001a95dbd1" containerName="install-certs-openstack-openstack-cell1" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.496625 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c058476-2e9f-42eb-b8f1-377820b00c85" containerName="registry-server" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.497572 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.501461 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.501659 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.501798 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.501981 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.523520 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-f8r4q"] Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.560513 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.560563 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ceph\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.560591 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cpcs\" (UniqueName: \"kubernetes.io/projected/07d0e22b-fd41-43a3-8d50-fc6a280e0637-kube-api-access-8cpcs\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.560752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-inventory\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.663123 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-inventory\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.663292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.663326 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ceph\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.663353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cpcs\" (UniqueName: \"kubernetes.io/projected/07d0e22b-fd41-43a3-8d50-fc6a280e0637-kube-api-access-8cpcs\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.669496 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-inventory\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.676372 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ceph\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.676862 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: 
\"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.679303 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cpcs\" (UniqueName: \"kubernetes.io/projected/07d0e22b-fd41-43a3-8d50-fc6a280e0637-kube-api-access-8cpcs\") pod \"ceph-client-openstack-openstack-cell1-f8r4q\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:28 crc kubenswrapper[4892]: I0217 19:52:28.821950 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:29 crc kubenswrapper[4892]: I0217 19:52:29.381658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-f8r4q"] Feb 17 19:52:29 crc kubenswrapper[4892]: I0217 19:52:29.406684 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" event={"ID":"07d0e22b-fd41-43a3-8d50-fc6a280e0637","Type":"ContainerStarted","Data":"4054b6d014abc49fce637dc38101eaa8b405f7efc9f122dec826041aeef404e3"} Feb 17 19:52:30 crc kubenswrapper[4892]: I0217 19:52:30.428569 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" event={"ID":"07d0e22b-fd41-43a3-8d50-fc6a280e0637","Type":"ContainerStarted","Data":"d6a4111dc8c3a6c1285307820980f64dc96a1b279436ecc9a7abc180ca759141"} Feb 17 19:52:30 crc kubenswrapper[4892]: I0217 19:52:30.462011 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" podStartSLOduration=1.973694839 podStartE2EDuration="2.461992464s" podCreationTimestamp="2026-02-17 19:52:28 +0000 UTC" firstStartedPulling="2026-02-17 19:52:29.371443737 +0000 UTC m=+7720.746847012" lastFinishedPulling="2026-02-17 
19:52:29.859741352 +0000 UTC m=+7721.235144637" observedRunningTime="2026-02-17 19:52:30.446964819 +0000 UTC m=+7721.822368154" watchObservedRunningTime="2026-02-17 19:52:30.461992464 +0000 UTC m=+7721.837395729" Feb 17 19:52:35 crc kubenswrapper[4892]: I0217 19:52:35.501507 4892 generic.go:334] "Generic (PLEG): container finished" podID="07d0e22b-fd41-43a3-8d50-fc6a280e0637" containerID="d6a4111dc8c3a6c1285307820980f64dc96a1b279436ecc9a7abc180ca759141" exitCode=0 Feb 17 19:52:35 crc kubenswrapper[4892]: I0217 19:52:35.501635 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" event={"ID":"07d0e22b-fd41-43a3-8d50-fc6a280e0637","Type":"ContainerDied","Data":"d6a4111dc8c3a6c1285307820980f64dc96a1b279436ecc9a7abc180ca759141"} Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.093138 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.189025 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ssh-key-openstack-cell1\") pod \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.189155 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ceph\") pod \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.189849 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cpcs\" (UniqueName: \"kubernetes.io/projected/07d0e22b-fd41-43a3-8d50-fc6a280e0637-kube-api-access-8cpcs\") pod 
\"07d0e22b-fd41-43a3-8d50-fc6a280e0637\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.190025 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-inventory\") pod \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\" (UID: \"07d0e22b-fd41-43a3-8d50-fc6a280e0637\") " Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.198311 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ceph" (OuterVolumeSpecName: "ceph") pod "07d0e22b-fd41-43a3-8d50-fc6a280e0637" (UID: "07d0e22b-fd41-43a3-8d50-fc6a280e0637"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.198385 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d0e22b-fd41-43a3-8d50-fc6a280e0637-kube-api-access-8cpcs" (OuterVolumeSpecName: "kube-api-access-8cpcs") pod "07d0e22b-fd41-43a3-8d50-fc6a280e0637" (UID: "07d0e22b-fd41-43a3-8d50-fc6a280e0637"). InnerVolumeSpecName "kube-api-access-8cpcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.235077 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-inventory" (OuterVolumeSpecName: "inventory") pod "07d0e22b-fd41-43a3-8d50-fc6a280e0637" (UID: "07d0e22b-fd41-43a3-8d50-fc6a280e0637"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.241784 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "07d0e22b-fd41-43a3-8d50-fc6a280e0637" (UID: "07d0e22b-fd41-43a3-8d50-fc6a280e0637"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.293385 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cpcs\" (UniqueName: \"kubernetes.io/projected/07d0e22b-fd41-43a3-8d50-fc6a280e0637-kube-api-access-8cpcs\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.293424 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.293437 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.293449 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07d0e22b-fd41-43a3-8d50-fc6a280e0637-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.527348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" event={"ID":"07d0e22b-fd41-43a3-8d50-fc6a280e0637","Type":"ContainerDied","Data":"4054b6d014abc49fce637dc38101eaa8b405f7efc9f122dec826041aeef404e3"} Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.527716 4892 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4054b6d014abc49fce637dc38101eaa8b405f7efc9f122dec826041aeef404e3" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.527428 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-f8r4q" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.631754 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-wt46w"] Feb 17 19:52:37 crc kubenswrapper[4892]: E0217 19:52:37.632338 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d0e22b-fd41-43a3-8d50-fc6a280e0637" containerName="ceph-client-openstack-openstack-cell1" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.632357 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d0e22b-fd41-43a3-8d50-fc6a280e0637" containerName="ceph-client-openstack-openstack-cell1" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.632563 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d0e22b-fd41-43a3-8d50-fc6a280e0637" containerName="ceph-client-openstack-openstack-cell1" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.633371 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.636297 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.637226 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.637390 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.637553 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.637858 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.660023 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-wt46w"] Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.703198 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.703442 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 
19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.703486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dpnh\" (UniqueName: \"kubernetes.io/projected/d107e682-a9c6-463f-ac66-142f2fe6d6c2-kube-api-access-6dpnh\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.703512 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-inventory\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.703537 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.703602 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ceph\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.805068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: 
\"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.805159 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dpnh\" (UniqueName: \"kubernetes.io/projected/d107e682-a9c6-463f-ac66-142f2fe6d6c2-kube-api-access-6dpnh\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.805202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-inventory\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.805240 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.805338 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ceph\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.805430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ssh-key-openstack-cell1\") pod 
\"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.806849 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.810347 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-inventory\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.810405 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ceph\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.810440 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.812343 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ssh-key-openstack-cell1\") pod 
\"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.825635 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dpnh\" (UniqueName: \"kubernetes.io/projected/d107e682-a9c6-463f-ac66-142f2fe6d6c2-kube-api-access-6dpnh\") pod \"ovn-openstack-openstack-cell1-wt46w\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:37 crc kubenswrapper[4892]: I0217 19:52:37.958570 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:52:38 crc kubenswrapper[4892]: I0217 19:52:38.561400 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-wt46w"] Feb 17 19:52:38 crc kubenswrapper[4892]: W0217 19:52:38.571645 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd107e682_a9c6_463f_ac66_142f2fe6d6c2.slice/crio-185d2dccba6fe5fccbf32efafe8e1bec870240df6f1fd8104fb1d58ea47a29b9 WatchSource:0}: Error finding container 185d2dccba6fe5fccbf32efafe8e1bec870240df6f1fd8104fb1d58ea47a29b9: Status 404 returned error can't find the container with id 185d2dccba6fe5fccbf32efafe8e1bec870240df6f1fd8104fb1d58ea47a29b9 Feb 17 19:52:39 crc kubenswrapper[4892]: I0217 19:52:39.553889 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-wt46w" event={"ID":"d107e682-a9c6-463f-ac66-142f2fe6d6c2","Type":"ContainerStarted","Data":"f4d9efb4b6a9d5f92f9a7def606bbba2b970992050fce01318b8cbd77ec95f20"} Feb 17 19:52:39 crc kubenswrapper[4892]: I0217 19:52:39.554442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-wt46w" 
event={"ID":"d107e682-a9c6-463f-ac66-142f2fe6d6c2","Type":"ContainerStarted","Data":"185d2dccba6fe5fccbf32efafe8e1bec870240df6f1fd8104fb1d58ea47a29b9"} Feb 17 19:52:39 crc kubenswrapper[4892]: I0217 19:52:39.581096 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-wt46w" podStartSLOduration=2.004429351 podStartE2EDuration="2.581079392s" podCreationTimestamp="2026-02-17 19:52:37 +0000 UTC" firstStartedPulling="2026-02-17 19:52:38.577772751 +0000 UTC m=+7729.953176056" lastFinishedPulling="2026-02-17 19:52:39.154422822 +0000 UTC m=+7730.529826097" observedRunningTime="2026-02-17 19:52:39.568517014 +0000 UTC m=+7730.943920289" watchObservedRunningTime="2026-02-17 19:52:39.581079392 +0000 UTC m=+7730.956482657" Feb 17 19:53:37 crc kubenswrapper[4892]: I0217 19:53:37.425367 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:53:37 crc kubenswrapper[4892]: I0217 19:53:37.426175 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:53:43 crc kubenswrapper[4892]: I0217 19:53:43.348807 4892 generic.go:334] "Generic (PLEG): container finished" podID="d107e682-a9c6-463f-ac66-142f2fe6d6c2" containerID="f4d9efb4b6a9d5f92f9a7def606bbba2b970992050fce01318b8cbd77ec95f20" exitCode=0 Feb 17 19:53:43 crc kubenswrapper[4892]: I0217 19:53:43.348864 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-wt46w" 
event={"ID":"d107e682-a9c6-463f-ac66-142f2fe6d6c2","Type":"ContainerDied","Data":"f4d9efb4b6a9d5f92f9a7def606bbba2b970992050fce01318b8cbd77ec95f20"} Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.846214 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.996936 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ssh-key-openstack-cell1\") pod \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.997303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dpnh\" (UniqueName: \"kubernetes.io/projected/d107e682-a9c6-463f-ac66-142f2fe6d6c2-kube-api-access-6dpnh\") pod \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.997449 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-inventory\") pod \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.997597 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovn-combined-ca-bundle\") pod \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.997784 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovncontroller-config-0\") pod \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " Feb 17 19:53:44 crc kubenswrapper[4892]: I0217 19:53:44.998269 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ceph\") pod \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\" (UID: \"d107e682-a9c6-463f-ac66-142f2fe6d6c2\") " Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.002859 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ceph" (OuterVolumeSpecName: "ceph") pod "d107e682-a9c6-463f-ac66-142f2fe6d6c2" (UID: "d107e682-a9c6-463f-ac66-142f2fe6d6c2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.005649 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d107e682-a9c6-463f-ac66-142f2fe6d6c2-kube-api-access-6dpnh" (OuterVolumeSpecName: "kube-api-access-6dpnh") pod "d107e682-a9c6-463f-ac66-142f2fe6d6c2" (UID: "d107e682-a9c6-463f-ac66-142f2fe6d6c2"). InnerVolumeSpecName "kube-api-access-6dpnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.019547 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d107e682-a9c6-463f-ac66-142f2fe6d6c2" (UID: "d107e682-a9c6-463f-ac66-142f2fe6d6c2"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.030449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d107e682-a9c6-463f-ac66-142f2fe6d6c2" (UID: "d107e682-a9c6-463f-ac66-142f2fe6d6c2"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.033521 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-inventory" (OuterVolumeSpecName: "inventory") pod "d107e682-a9c6-463f-ac66-142f2fe6d6c2" (UID: "d107e682-a9c6-463f-ac66-142f2fe6d6c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.040606 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d107e682-a9c6-463f-ac66-142f2fe6d6c2" (UID: "d107e682-a9c6-463f-ac66-142f2fe6d6c2"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.103142 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.103690 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.103916 4892 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.104113 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.104283 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d107e682-a9c6-463f-ac66-142f2fe6d6c2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.104430 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dpnh\" (UniqueName: \"kubernetes.io/projected/d107e682-a9c6-463f-ac66-142f2fe6d6c2-kube-api-access-6dpnh\") on node \"crc\" DevicePath \"\"" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.371662 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-wt46w" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.373418 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-wt46w" event={"ID":"d107e682-a9c6-463f-ac66-142f2fe6d6c2","Type":"ContainerDied","Data":"185d2dccba6fe5fccbf32efafe8e1bec870240df6f1fd8104fb1d58ea47a29b9"} Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.373453 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185d2dccba6fe5fccbf32efafe8e1bec870240df6f1fd8104fb1d58ea47a29b9" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.474835 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-c8mfc"] Feb 17 19:53:45 crc kubenswrapper[4892]: E0217 19:53:45.475837 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d107e682-a9c6-463f-ac66-142f2fe6d6c2" containerName="ovn-openstack-openstack-cell1" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.475863 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d107e682-a9c6-463f-ac66-142f2fe6d6c2" containerName="ovn-openstack-openstack-cell1" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.476136 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d107e682-a9c6-463f-ac66-142f2fe6d6c2" containerName="ovn-openstack-openstack-cell1" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.477765 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.482611 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.482977 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.483210 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.483425 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.483580 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.483712 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.491725 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-c8mfc"] Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615554 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615672 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615760 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615789 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615879 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615926 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ssh-key-openstack-cell1\") pod 
\"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.615973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsc99\" (UniqueName: \"kubernetes.io/projected/4e432a42-0176-49de-915c-f655506500e6-kube-api-access-fsc99\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.717803 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.717940 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.718009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 
19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.718041 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.718131 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.718186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.718273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsc99\" (UniqueName: \"kubernetes.io/projected/4e432a42-0176-49de-915c-f655506500e6-kube-api-access-fsc99\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.723794 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-inventory\") pod 
\"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.723845 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.724444 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.724708 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.725452 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.726799 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.742172 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsc99\" (UniqueName: \"kubernetes.io/projected/4e432a42-0176-49de-915c-f655506500e6-kube-api-access-fsc99\") pod \"neutron-metadata-openstack-openstack-cell1-c8mfc\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:45 crc kubenswrapper[4892]: I0217 19:53:45.797121 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:53:46 crc kubenswrapper[4892]: I0217 19:53:46.418188 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-c8mfc"] Feb 17 19:53:46 crc kubenswrapper[4892]: I0217 19:53:46.438019 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 19:53:47 crc kubenswrapper[4892]: I0217 19:53:47.392096 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" event={"ID":"4e432a42-0176-49de-915c-f655506500e6","Type":"ContainerStarted","Data":"6558a952cd3ebef70ec4960a0aba2a550e082e4b4968be2a13bd8b011c047282"} Feb 17 19:53:47 crc kubenswrapper[4892]: I0217 19:53:47.392707 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" 
event={"ID":"4e432a42-0176-49de-915c-f655506500e6","Type":"ContainerStarted","Data":"7ccf75284011cb16fe69c27450db739472eb358e4eebe28af55b19bd24ccd375"} Feb 17 19:53:47 crc kubenswrapper[4892]: I0217 19:53:47.422515 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" podStartSLOduration=1.943245168 podStartE2EDuration="2.422495989s" podCreationTimestamp="2026-02-17 19:53:45 +0000 UTC" firstStartedPulling="2026-02-17 19:53:46.43780084 +0000 UTC m=+7797.813204095" lastFinishedPulling="2026-02-17 19:53:46.917051651 +0000 UTC m=+7798.292454916" observedRunningTime="2026-02-17 19:53:47.411332187 +0000 UTC m=+7798.786735472" watchObservedRunningTime="2026-02-17 19:53:47.422495989 +0000 UTC m=+7798.797899254" Feb 17 19:54:07 crc kubenswrapper[4892]: I0217 19:54:07.425293 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:54:07 crc kubenswrapper[4892]: I0217 19:54:07.425977 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:54:37 crc kubenswrapper[4892]: I0217 19:54:37.424603 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 19:54:37 crc kubenswrapper[4892]: I0217 19:54:37.425210 4892 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 19:54:37 crc kubenswrapper[4892]: I0217 19:54:37.425259 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 19:54:37 crc kubenswrapper[4892]: I0217 19:54:37.425978 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 19:54:37 crc kubenswrapper[4892]: I0217 19:54:37.426053 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" gracePeriod=600 Feb 17 19:54:37 crc kubenswrapper[4892]: E0217 19:54:37.554110 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:54:38 crc kubenswrapper[4892]: I0217 19:54:38.062156 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" 
containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" exitCode=0 Feb 17 19:54:38 crc kubenswrapper[4892]: I0217 19:54:38.062228 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35"} Feb 17 19:54:38 crc kubenswrapper[4892]: I0217 19:54:38.062279 4892 scope.go:117] "RemoveContainer" containerID="3b5fc1d94d26a2a79cb5f7d6b0aa8474c2bc104c8b47fe3294fed95ffd456f3a" Feb 17 19:54:38 crc kubenswrapper[4892]: I0217 19:54:38.063142 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:54:38 crc kubenswrapper[4892]: E0217 19:54:38.063452 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:54:48 crc kubenswrapper[4892]: I0217 19:54:48.204406 4892 generic.go:334] "Generic (PLEG): container finished" podID="4e432a42-0176-49de-915c-f655506500e6" containerID="6558a952cd3ebef70ec4960a0aba2a550e082e4b4968be2a13bd8b011c047282" exitCode=0 Feb 17 19:54:48 crc kubenswrapper[4892]: I0217 19:54:48.204623 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" event={"ID":"4e432a42-0176-49de-915c-f655506500e6","Type":"ContainerDied","Data":"6558a952cd3ebef70ec4960a0aba2a550e082e4b4968be2a13bd8b011c047282"} Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.788060 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.852536 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsc99\" (UniqueName: \"kubernetes.io/projected/4e432a42-0176-49de-915c-f655506500e6-kube-api-access-fsc99\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.852885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ceph\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.852946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.852985 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-nova-metadata-neutron-config-0\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.853013 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ssh-key-openstack-cell1\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 
19:54:49.853053 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-inventory\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.853109 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-metadata-combined-ca-bundle\") pod \"4e432a42-0176-49de-915c-f655506500e6\" (UID: \"4e432a42-0176-49de-915c-f655506500e6\") " Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.860271 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.866043 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ceph" (OuterVolumeSpecName: "ceph") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.866214 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e432a42-0176-49de-915c-f655506500e6-kube-api-access-fsc99" (OuterVolumeSpecName: "kube-api-access-fsc99") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "kube-api-access-fsc99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.894001 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.894583 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-inventory" (OuterVolumeSpecName: "inventory") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.905792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.923722 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4e432a42-0176-49de-915c-f655506500e6" (UID: "4e432a42-0176-49de-915c-f655506500e6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.958964 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.959005 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.959020 4892 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.959030 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.959042 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.959051 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e432a42-0176-49de-915c-f655506500e6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:49 crc kubenswrapper[4892]: I0217 19:54:49.959063 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsc99\" (UniqueName: 
\"kubernetes.io/projected/4e432a42-0176-49de-915c-f655506500e6-kube-api-access-fsc99\") on node \"crc\" DevicePath \"\"" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.243779 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" event={"ID":"4e432a42-0176-49de-915c-f655506500e6","Type":"ContainerDied","Data":"7ccf75284011cb16fe69c27450db739472eb358e4eebe28af55b19bd24ccd375"} Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.243831 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ccf75284011cb16fe69c27450db739472eb358e4eebe28af55b19bd24ccd375" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.243887 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c8mfc" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.343374 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-bfmls"] Feb 17 19:54:50 crc kubenswrapper[4892]: E0217 19:54:50.343921 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e432a42-0176-49de-915c-f655506500e6" containerName="neutron-metadata-openstack-openstack-cell1" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.343939 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e432a42-0176-49de-915c-f655506500e6" containerName="neutron-metadata-openstack-openstack-cell1" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.344174 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e432a42-0176-49de-915c-f655506500e6" containerName="neutron-metadata-openstack-openstack-cell1" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.345020 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.349003 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.349026 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.350312 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.350961 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.351207 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.353415 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-bfmls"] Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.478090 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.478178 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ceph\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 
19:54:50.478215 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-inventory\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.478236 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9klp5\" (UniqueName: \"kubernetes.io/projected/0f868edd-d082-4f76-87db-34a40d03ba30-kube-api-access-9klp5\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.478256 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.478313 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.580625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-inventory\") pod 
\"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.580672 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9klp5\" (UniqueName: \"kubernetes.io/projected/0f868edd-d082-4f76-87db-34a40d03ba30-kube-api-access-9klp5\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.580702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.580769 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.580952 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.581019 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ceph\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.585574 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.585982 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.586954 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.587727 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ceph\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.587858 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-inventory\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.599736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9klp5\" (UniqueName: \"kubernetes.io/projected/0f868edd-d082-4f76-87db-34a40d03ba30-kube-api-access-9klp5\") pod \"libvirt-openstack-openstack-cell1-bfmls\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:50 crc kubenswrapper[4892]: I0217 19:54:50.665886 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:54:51 crc kubenswrapper[4892]: I0217 19:54:51.312435 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-bfmls"] Feb 17 19:54:52 crc kubenswrapper[4892]: I0217 19:54:52.273487 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" event={"ID":"0f868edd-d082-4f76-87db-34a40d03ba30","Type":"ContainerStarted","Data":"40e6787d7722c3c5a827e61edc14a107fc4383fcb96b0cfe718cccb5ab73d39c"} Feb 17 19:54:52 crc kubenswrapper[4892]: I0217 19:54:52.273869 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" event={"ID":"0f868edd-d082-4f76-87db-34a40d03ba30","Type":"ContainerStarted","Data":"51b5facd7a28db7f37eaed1eedadb49aaedf198fa9de5bce54f61dcf3f1af9db"} Feb 17 19:54:52 crc kubenswrapper[4892]: I0217 19:54:52.300720 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" podStartSLOduration=1.869665425 podStartE2EDuration="2.300697274s" podCreationTimestamp="2026-02-17 19:54:50 +0000 
UTC" firstStartedPulling="2026-02-17 19:54:51.316342274 +0000 UTC m=+7862.691745549" lastFinishedPulling="2026-02-17 19:54:51.747374083 +0000 UTC m=+7863.122777398" observedRunningTime="2026-02-17 19:54:52.294428584 +0000 UTC m=+7863.669831849" watchObservedRunningTime="2026-02-17 19:54:52.300697274 +0000 UTC m=+7863.676100539" Feb 17 19:54:52 crc kubenswrapper[4892]: I0217 19:54:52.359525 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:54:52 crc kubenswrapper[4892]: E0217 19:54:52.359832 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:55:04 crc kubenswrapper[4892]: I0217 19:55:04.360141 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:55:04 crc kubenswrapper[4892]: E0217 19:55:04.360915 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:55:09 crc kubenswrapper[4892]: I0217 19:55:09.144511 4892 scope.go:117] "RemoveContainer" containerID="0dadd2c7a4ab9a2cb068cf4fdc67f117a85ec2c007c2766124ad0082b9b3fc38" Feb 17 19:55:09 crc kubenswrapper[4892]: I0217 19:55:09.187025 4892 scope.go:117] "RemoveContainer" 
containerID="c63c1375c6726ab96dd50d93cad13946ea4617c6bdf08087c48ab1d708e4fbbc" Feb 17 19:55:09 crc kubenswrapper[4892]: I0217 19:55:09.272871 4892 scope.go:117] "RemoveContainer" containerID="6ac4435a58251af7866637dfecf5ec1d2378861e7fb64a0ea26ea23daefc301a" Feb 17 19:55:19 crc kubenswrapper[4892]: I0217 19:55:19.377164 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:55:19 crc kubenswrapper[4892]: E0217 19:55:19.378295 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:55:32 crc kubenswrapper[4892]: I0217 19:55:32.359264 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:55:32 crc kubenswrapper[4892]: E0217 19:55:32.360104 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:55:44 crc kubenswrapper[4892]: I0217 19:55:44.360123 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:55:44 crc kubenswrapper[4892]: E0217 19:55:44.360989 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:55:59 crc kubenswrapper[4892]: I0217 19:55:59.382680 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:55:59 crc kubenswrapper[4892]: E0217 19:55:59.384894 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.616926 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.625478 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/alertmanager-metric-storage-0" podUID="b1b1be0e-8ec2-48ea-967b-a89c7e20bea9" containerName="alertmanager" probeResult="failure" output="Get \"http://10.217.1.162:9093/-/ready\": dial tcp 10.217.1.162:9093: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.631439 4892 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-n5wfv container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.159:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= 
Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.631496 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-n5wfv" podUID="609d9353-9db2-4a1c-8f00-8cfe986c3b12" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.1.159:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.632061 4892 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-nj852 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.632090 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" podUID="14853a40-bee2-4d8e-a148-5f0ae761d71c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.632776 4892 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-nj852 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.632803 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" podUID="14853a40-bee2-4d8e-a148-5f0ae761d71c" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.633067 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-nnn4l" podUID="8607d279-cebb-4590-a0c0-7dda3de5dfd5" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": dial tcp 10.217.0.42:9898: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.635666 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-volume-volume1-0" podUID="196db019-189a-4787-a766-f9ae8d46cbea" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.1.117:8080/\": dial tcp 10.217.1.117:8080: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.636772 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b8b75602-eb3c-41f2-85d9-e5b055bd0724" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.163:9090/-/ready\": dial tcp 10.217.1.163:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.637723 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b8b75602-eb3c-41f2-85d9-e5b055bd0724" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.163:9090/-/healthy\": dial tcp 10.217.1.163:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.642147 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="b4368bec-a527-4e2e-bcd8-c3b83faf9bca" 
containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.120:8080/\": dial tcp 10.217.1.120:8080: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.648345 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="b762d850-e72b-4151-97c1-c6e1f8c9e76f" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.119:8776/healthcheck\": dial tcp 10.217.1.119:8776: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.655268 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-694cc5859d-jlpkr" podUID="8380e84c-8f80-43dc-825e-d9dd3dd0533f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.59:9311/healthcheck\": dial tcp 10.217.1.59:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.656940 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-694cc5859d-jlpkr" podUID="8380e84c-8f80-43dc-825e-d9dd3dd0533f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.59:9311/healthcheck\": dial tcp 10.217.1.59:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.666186 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.681551 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="8f2c6c93-db35-404a-a613-34cb4b2de98b" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.110:6080/vnc_lite.html\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc 
kubenswrapper[4892]: I0217 19:56:17.684307 4892 patch_prober.go:28] interesting pod/route-controller-manager-58cdcc4d95-wq4w7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.684370 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" podUID="6a1b1449-9911-496a-bb90-b207edd42528" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: E0217 19:56:17.705672 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.769945 4892 patch_prober.go:28] interesting pod/route-controller-manager-58cdcc4d95-wq4w7 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.770043 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-58cdcc4d95-wq4w7" 
podUID="6a1b1449-9911-496a-bb90-b207edd42528" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.770111 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="0c0d1a90-2e96-43e4-9ed7-b375dd729dd5" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.1.118:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.770592 4892 patch_prober.go:28] interesting pod/dns-default-cv2jf container/dns namespace/openshift-dns: Readiness probe status=failure output="Get \"http://10.217.0.43:8181/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.770617 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-dns/dns-default-cv2jf" podUID="1b62a756-057b-45a6-b568-b788dbb95fa0" containerName="dns" probeResult="failure" output="Get \"http://10.217.0.43:8181/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.829314 4892 patch_prober.go:28] interesting pod/network-check-target-xd92c container/network-check-target-container namespace/openshift-network-diagnostics: Readiness probe status=failure output="Get \"http://10.217.0.4:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:56:17 crc kubenswrapper[4892]: I0217 19:56:17.829368 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" containerName="network-check-target-container" 
probeResult="failure" output="Get \"http://10.217.0.4:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:56:29 crc kubenswrapper[4892]: I0217 19:56:29.367297 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:56:29 crc kubenswrapper[4892]: E0217 19:56:29.368048 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:56:42 crc kubenswrapper[4892]: I0217 19:56:42.361299 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:56:42 crc kubenswrapper[4892]: E0217 19:56:42.362759 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:56:54 crc kubenswrapper[4892]: I0217 19:56:54.360542 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:56:54 crc kubenswrapper[4892]: E0217 19:56:54.361911 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:57:02 crc kubenswrapper[4892]: I0217 19:57:02.872051 4892 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-nj852 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 19:57:02 crc kubenswrapper[4892]: I0217 19:57:02.872615 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nj852" podUID="14853a40-bee2-4d8e-a148-5f0ae761d71c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:57:05 crc kubenswrapper[4892]: I0217 19:57:05.360741 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:57:05 crc kubenswrapper[4892]: E0217 19:57:05.361905 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:57:06 crc kubenswrapper[4892]: I0217 19:57:06.767883 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" 
Feb 17 19:57:11 crc kubenswrapper[4892]: I0217 19:57:11.767719 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 17 19:57:16 crc kubenswrapper[4892]: I0217 19:57:16.767141 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 17 19:57:16 crc kubenswrapper[4892]: I0217 19:57:16.767701 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Feb 17 19:57:16 crc kubenswrapper[4892]: I0217 19:57:16.768430 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Feb 17 19:57:16 crc kubenswrapper[4892]: I0217 19:57:16.768776 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"eb4484c662b22b2f889a6e41fda5cd12f48a6f2ba6f7509c806823f31c8e8aa1"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Feb 17 19:57:16 crc kubenswrapper[4892]: I0217 19:57:16.768930 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerName="ceilometer-central-agent" containerID="cri-o://eb4484c662b22b2f889a6e41fda5cd12f48a6f2ba6f7509c806823f31c8e8aa1" gracePeriod=30 Feb 17 19:57:19 crc kubenswrapper[4892]: I0217 19:57:19.370247 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:57:19 crc 
kubenswrapper[4892]: E0217 19:57:19.370930 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:57:24 crc kubenswrapper[4892]: I0217 19:57:24.947262 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" podUID="67ccc01c-23ce-407b-91dd-9554c49acbd5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:57:24 crc kubenswrapper[4892]: I0217 19:57:24.947350 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" podUID="67ccc01c-23ce-407b-91dd-9554c49acbd5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 19:57:31 crc kubenswrapper[4892]: I0217 19:57:31.406284 4892 trace.go:236] Trace[903935932]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-2" (17-Feb-2026 19:57:26.198) (total time: 5199ms): Feb 17 19:57:31 crc kubenswrapper[4892]: Trace[903935932]: [5.199669644s] [5.199669644s] END Feb 17 19:57:31 crc kubenswrapper[4892]: E0217 19:57:31.764873 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ccc01c_23ce_407b_91dd_9554c49acbd5.slice/crio-conmon-37feb22caad006be9f46d877ae4484912e01926b39d64f4e3cf3b2bb7d7ea927.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef7f0a40_a8bc_4a01_be30_3f0d5e5fd52f.slice/crio-conmon-eb4484c662b22b2f889a6e41fda5cd12f48a6f2ba6f7509c806823f31c8e8aa1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f194fc_64b3_4ef2_9006_8b533ce72000.slice/crio-conmon-76383e265ea0359d55ed3e51165ffd91efcbc54205644b3d02cc5331ee8fd680.scope\": RecentStats: unable to find data in memory cache]" Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.359498 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:57:32 crc kubenswrapper[4892]: E0217 19:57:32.360371 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.447038 4892 generic.go:334] "Generic (PLEG): container finished" podID="ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f" containerID="eb4484c662b22b2f889a6e41fda5cd12f48a6f2ba6f7509c806823f31c8e8aa1" exitCode=0 Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.448194 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerDied","Data":"eb4484c662b22b2f889a6e41fda5cd12f48a6f2ba6f7509c806823f31c8e8aa1"} Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.451460 4892 generic.go:334] "Generic (PLEG): container finished" podID="48f194fc-64b3-4ef2-9006-8b533ce72000" 
containerID="76383e265ea0359d55ed3e51165ffd91efcbc54205644b3d02cc5331ee8fd680" exitCode=1 Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.451543 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" event={"ID":"48f194fc-64b3-4ef2-9006-8b533ce72000","Type":"ContainerDied","Data":"76383e265ea0359d55ed3e51165ffd91efcbc54205644b3d02cc5331ee8fd680"} Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.452404 4892 scope.go:117] "RemoveContainer" containerID="76383e265ea0359d55ed3e51165ffd91efcbc54205644b3d02cc5331ee8fd680" Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.454022 4892 generic.go:334] "Generic (PLEG): container finished" podID="67ccc01c-23ce-407b-91dd-9554c49acbd5" containerID="37feb22caad006be9f46d877ae4484912e01926b39d64f4e3cf3b2bb7d7ea927" exitCode=1 Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.454061 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" event={"ID":"67ccc01c-23ce-407b-91dd-9554c49acbd5","Type":"ContainerDied","Data":"37feb22caad006be9f46d877ae4484912e01926b39d64f4e3cf3b2bb7d7ea927"} Feb 17 19:57:32 crc kubenswrapper[4892]: I0217 19:57:32.456978 4892 scope.go:117] "RemoveContainer" containerID="37feb22caad006be9f46d877ae4484912e01926b39d64f4e3cf3b2bb7d7ea927" Feb 17 19:57:33 crc kubenswrapper[4892]: I0217 19:57:33.467468 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" event={"ID":"67ccc01c-23ce-407b-91dd-9554c49acbd5","Type":"ContainerStarted","Data":"2383fe1da250e3ff6079ba8eb119e04f8e6852e56ea3b0e0cd2ea03a09f678ea"} Feb 17 19:57:33 crc kubenswrapper[4892]: I0217 19:57:33.469512 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 19:57:33 crc kubenswrapper[4892]: I0217 
19:57:33.788753 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f","Type":"ContainerStarted","Data":"9033b456065bf993fa4ad2448ff4b47f1fce1b4160c4145fd331add527072938"} Feb 17 19:57:33 crc kubenswrapper[4892]: I0217 19:57:33.797405 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" event={"ID":"48f194fc-64b3-4ef2-9006-8b533ce72000","Type":"ContainerStarted","Data":"bba603a166f6f6adcf2e95c657388173e3e297cb262a18a7fcb1f16752607941"} Feb 17 19:57:33 crc kubenswrapper[4892]: I0217 19:57:33.798363 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 19:57:43 crc kubenswrapper[4892]: I0217 19:57:43.912076 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ngzhd" Feb 17 19:57:43 crc kubenswrapper[4892]: I0217 19:57:43.987764 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-nr7vj" Feb 17 19:57:45 crc kubenswrapper[4892]: I0217 19:57:45.360410 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:57:45 crc kubenswrapper[4892]: E0217 19:57:45.361079 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:57:56 crc kubenswrapper[4892]: I0217 19:57:56.361093 4892 scope.go:117] 
"RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:57:56 crc kubenswrapper[4892]: E0217 19:57:56.362151 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:58:11 crc kubenswrapper[4892]: I0217 19:58:11.359790 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:58:11 crc kubenswrapper[4892]: E0217 19:58:11.360983 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:58:24 crc kubenswrapper[4892]: I0217 19:58:24.359395 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:58:24 crc kubenswrapper[4892]: E0217 19:58:24.360467 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:58:38 crc kubenswrapper[4892]: I0217 19:58:38.360613 
4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:58:38 crc kubenswrapper[4892]: E0217 19:58:38.361620 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:58:53 crc kubenswrapper[4892]: I0217 19:58:53.359508 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:58:53 crc kubenswrapper[4892]: E0217 19:58:53.360455 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.340761 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k48p7"] Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.344129 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.354421 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k48p7"] Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.479434 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-utilities\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.479483 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrv8h\" (UniqueName: \"kubernetes.io/projected/d405651f-0e0c-4099-8656-67e77b88d50d-kube-api-access-vrv8h\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.479798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-catalog-content\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.582963 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-utilities\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.583066 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vrv8h\" (UniqueName: \"kubernetes.io/projected/d405651f-0e0c-4099-8656-67e77b88d50d-kube-api-access-vrv8h\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.583252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-catalog-content\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.584032 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-catalog-content\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.584288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-utilities\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.610432 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrv8h\" (UniqueName: \"kubernetes.io/projected/d405651f-0e0c-4099-8656-67e77b88d50d-kube-api-access-vrv8h\") pod \"certified-operators-k48p7\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:06 crc kubenswrapper[4892]: I0217 19:59:06.680256 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:07 crc kubenswrapper[4892]: I0217 19:59:07.218255 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k48p7"] Feb 17 19:59:07 crc kubenswrapper[4892]: I0217 19:59:07.360889 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:59:07 crc kubenswrapper[4892]: E0217 19:59:07.361181 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:59:08 crc kubenswrapper[4892]: I0217 19:59:08.087801 4892 generic.go:334] "Generic (PLEG): container finished" podID="d405651f-0e0c-4099-8656-67e77b88d50d" containerID="4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9" exitCode=0 Feb 17 19:59:08 crc kubenswrapper[4892]: I0217 19:59:08.087895 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerDied","Data":"4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9"} Feb 17 19:59:08 crc kubenswrapper[4892]: I0217 19:59:08.087924 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerStarted","Data":"4c0a6b48a533d7847f9e97cac80728724263af07fd5884c359307348988edca6"} Feb 17 19:59:08 crc kubenswrapper[4892]: I0217 19:59:08.091632 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 
19:59:10 crc kubenswrapper[4892]: I0217 19:59:10.108525 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerStarted","Data":"c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827"} Feb 17 19:59:12 crc kubenswrapper[4892]: I0217 19:59:12.130972 4892 generic.go:334] "Generic (PLEG): container finished" podID="d405651f-0e0c-4099-8656-67e77b88d50d" containerID="c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827" exitCode=0 Feb 17 19:59:12 crc kubenswrapper[4892]: I0217 19:59:12.131035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerDied","Data":"c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827"} Feb 17 19:59:13 crc kubenswrapper[4892]: I0217 19:59:13.141719 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerStarted","Data":"eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed"} Feb 17 19:59:13 crc kubenswrapper[4892]: I0217 19:59:13.173564 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k48p7" podStartSLOduration=2.710753862 podStartE2EDuration="7.173549127s" podCreationTimestamp="2026-02-17 19:59:06 +0000 UTC" firstStartedPulling="2026-02-17 19:59:08.091138992 +0000 UTC m=+8119.466542257" lastFinishedPulling="2026-02-17 19:59:12.553934257 +0000 UTC m=+8123.929337522" observedRunningTime="2026-02-17 19:59:13.162097008 +0000 UTC m=+8124.537500313" watchObservedRunningTime="2026-02-17 19:59:13.173549127 +0000 UTC m=+8124.548952392" Feb 17 19:59:16 crc kubenswrapper[4892]: I0217 19:59:16.681513 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:16 crc kubenswrapper[4892]: I0217 19:59:16.681905 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:16 crc kubenswrapper[4892]: I0217 19:59:16.778716 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:17 crc kubenswrapper[4892]: I0217 19:59:17.277245 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:17 crc kubenswrapper[4892]: I0217 19:59:17.333202 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k48p7"] Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.237323 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k48p7" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="registry-server" containerID="cri-o://eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed" gracePeriod=2 Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.865217 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.951702 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-catalog-content\") pod \"d405651f-0e0c-4099-8656-67e77b88d50d\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.951891 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-utilities\") pod \"d405651f-0e0c-4099-8656-67e77b88d50d\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.951962 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrv8h\" (UniqueName: \"kubernetes.io/projected/d405651f-0e0c-4099-8656-67e77b88d50d-kube-api-access-vrv8h\") pod \"d405651f-0e0c-4099-8656-67e77b88d50d\" (UID: \"d405651f-0e0c-4099-8656-67e77b88d50d\") " Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.953713 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-utilities" (OuterVolumeSpecName: "utilities") pod "d405651f-0e0c-4099-8656-67e77b88d50d" (UID: "d405651f-0e0c-4099-8656-67e77b88d50d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:59:19 crc kubenswrapper[4892]: I0217 19:59:19.976312 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d405651f-0e0c-4099-8656-67e77b88d50d-kube-api-access-vrv8h" (OuterVolumeSpecName: "kube-api-access-vrv8h") pod "d405651f-0e0c-4099-8656-67e77b88d50d" (UID: "d405651f-0e0c-4099-8656-67e77b88d50d"). InnerVolumeSpecName "kube-api-access-vrv8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.006937 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d405651f-0e0c-4099-8656-67e77b88d50d" (UID: "d405651f-0e0c-4099-8656-67e77b88d50d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.053795 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrv8h\" (UniqueName: \"kubernetes.io/projected/d405651f-0e0c-4099-8656-67e77b88d50d-kube-api-access-vrv8h\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.053839 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.053848 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405651f-0e0c-4099-8656-67e77b88d50d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.249406 4892 generic.go:334] "Generic (PLEG): container finished" podID="d405651f-0e0c-4099-8656-67e77b88d50d" containerID="eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed" exitCode=0 Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.249456 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerDied","Data":"eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed"} Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.249517 4892 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-k48p7" event={"ID":"d405651f-0e0c-4099-8656-67e77b88d50d","Type":"ContainerDied","Data":"4c0a6b48a533d7847f9e97cac80728724263af07fd5884c359307348988edca6"} Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.249540 4892 scope.go:117] "RemoveContainer" containerID="eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.249733 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k48p7" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.285875 4892 scope.go:117] "RemoveContainer" containerID="c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.291684 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k48p7"] Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.304715 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k48p7"] Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.313962 4892 scope.go:117] "RemoveContainer" containerID="4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.372935 4892 scope.go:117] "RemoveContainer" containerID="eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed" Feb 17 19:59:20 crc kubenswrapper[4892]: E0217 19:59:20.377071 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed\": container with ID starting with eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed not found: ID does not exist" containerID="eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 
19:59:20.377137 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed"} err="failed to get container status \"eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed\": rpc error: code = NotFound desc = could not find container \"eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed\": container with ID starting with eaddaf3f3426b74bb14b69e564019fb4dd330aaf5071872baf79a2b247abf3ed not found: ID does not exist" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.377177 4892 scope.go:117] "RemoveContainer" containerID="c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827" Feb 17 19:59:20 crc kubenswrapper[4892]: E0217 19:59:20.380607 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827\": container with ID starting with c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827 not found: ID does not exist" containerID="c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.380665 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827"} err="failed to get container status \"c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827\": rpc error: code = NotFound desc = could not find container \"c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827\": container with ID starting with c42511c7d68c1272a7bb3f9572551e57cf8db707bf7235795fda66ad56f94827 not found: ID does not exist" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.380701 4892 scope.go:117] "RemoveContainer" containerID="4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9" Feb 17 19:59:20 crc 
kubenswrapper[4892]: E0217 19:59:20.381058 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9\": container with ID starting with 4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9 not found: ID does not exist" containerID="4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9" Feb 17 19:59:20 crc kubenswrapper[4892]: I0217 19:59:20.381094 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9"} err="failed to get container status \"4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9\": rpc error: code = NotFound desc = could not find container \"4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9\": container with ID starting with 4cc95d5458794dd8ee9d434ef5607df7e52d691b549d62cbab5450bfcc74ebe9 not found: ID does not exist" Feb 17 19:59:21 crc kubenswrapper[4892]: I0217 19:59:21.360339 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:59:21 crc kubenswrapper[4892]: E0217 19:59:21.361437 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:59:21 crc kubenswrapper[4892]: I0217 19:59:21.385855 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" path="/var/lib/kubelet/pods/d405651f-0e0c-4099-8656-67e77b88d50d/volumes" Feb 17 19:59:24 crc 
kubenswrapper[4892]: I0217 19:59:24.303395 4892 generic.go:334] "Generic (PLEG): container finished" podID="0f868edd-d082-4f76-87db-34a40d03ba30" containerID="40e6787d7722c3c5a827e61edc14a107fc4383fcb96b0cfe718cccb5ab73d39c" exitCode=0 Feb 17 19:59:24 crc kubenswrapper[4892]: I0217 19:59:24.303502 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" event={"ID":"0f868edd-d082-4f76-87db-34a40d03ba30","Type":"ContainerDied","Data":"40e6787d7722c3c5a827e61edc14a107fc4383fcb96b0cfe718cccb5ab73d39c"} Feb 17 19:59:25 crc kubenswrapper[4892]: I0217 19:59:25.876967 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.063039 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ceph\") pod \"0f868edd-d082-4f76-87db-34a40d03ba30\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.063127 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-inventory\") pod \"0f868edd-d082-4f76-87db-34a40d03ba30\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.064306 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-combined-ca-bundle\") pod \"0f868edd-d082-4f76-87db-34a40d03ba30\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.064476 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-secret-0\") pod \"0f868edd-d082-4f76-87db-34a40d03ba30\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.064586 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ssh-key-openstack-cell1\") pod \"0f868edd-d082-4f76-87db-34a40d03ba30\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.064654 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9klp5\" (UniqueName: \"kubernetes.io/projected/0f868edd-d082-4f76-87db-34a40d03ba30-kube-api-access-9klp5\") pod \"0f868edd-d082-4f76-87db-34a40d03ba30\" (UID: \"0f868edd-d082-4f76-87db-34a40d03ba30\") " Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.068612 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ceph" (OuterVolumeSpecName: "ceph") pod "0f868edd-d082-4f76-87db-34a40d03ba30" (UID: "0f868edd-d082-4f76-87db-34a40d03ba30"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.068994 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0f868edd-d082-4f76-87db-34a40d03ba30" (UID: "0f868edd-d082-4f76-87db-34a40d03ba30"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.070177 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f868edd-d082-4f76-87db-34a40d03ba30-kube-api-access-9klp5" (OuterVolumeSpecName: "kube-api-access-9klp5") pod "0f868edd-d082-4f76-87db-34a40d03ba30" (UID: "0f868edd-d082-4f76-87db-34a40d03ba30"). InnerVolumeSpecName "kube-api-access-9klp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.098122 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0f868edd-d082-4f76-87db-34a40d03ba30" (UID: "0f868edd-d082-4f76-87db-34a40d03ba30"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.105316 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0f868edd-d082-4f76-87db-34a40d03ba30" (UID: "0f868edd-d082-4f76-87db-34a40d03ba30"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.123649 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-inventory" (OuterVolumeSpecName: "inventory") pod "0f868edd-d082-4f76-87db-34a40d03ba30" (UID: "0f868edd-d082-4f76-87db-34a40d03ba30"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.167010 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.167054 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.167067 4892 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.167080 4892 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.167094 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f868edd-d082-4f76-87db-34a40d03ba30-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.167106 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9klp5\" (UniqueName: \"kubernetes.io/projected/0f868edd-d082-4f76-87db-34a40d03ba30-kube-api-access-9klp5\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.328041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" event={"ID":"0f868edd-d082-4f76-87db-34a40d03ba30","Type":"ContainerDied","Data":"51b5facd7a28db7f37eaed1eedadb49aaedf198fa9de5bce54f61dcf3f1af9db"} 
Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.328114 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b5facd7a28db7f37eaed1eedadb49aaedf198fa9de5bce54f61dcf3f1af9db" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.328139 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-bfmls" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.429628 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-8lwbt"] Feb 17 19:59:26 crc kubenswrapper[4892]: E0217 19:59:26.430563 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f868edd-d082-4f76-87db-34a40d03ba30" containerName="libvirt-openstack-openstack-cell1" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.430586 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f868edd-d082-4f76-87db-34a40d03ba30" containerName="libvirt-openstack-openstack-cell1" Feb 17 19:59:26 crc kubenswrapper[4892]: E0217 19:59:26.430603 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="extract-utilities" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.430613 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="extract-utilities" Feb 17 19:59:26 crc kubenswrapper[4892]: E0217 19:59:26.430623 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="registry-server" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.430633 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="registry-server" Feb 17 19:59:26 crc kubenswrapper[4892]: E0217 19:59:26.430680 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" 
containerName="extract-content" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.430688 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="extract-content" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.431020 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f868edd-d082-4f76-87db-34a40d03ba30" containerName="libvirt-openstack-openstack-cell1" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.431050 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d405651f-0e0c-4099-8656-67e77b88d50d" containerName="registry-server" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.432201 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.434990 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.435153 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.435399 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.436479 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.436778 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.437625 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.438211 4892 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"nova-cell1-compute-config" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.466698 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-8lwbt"] Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.493243 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494086 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494363 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494633 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc 
kubenswrapper[4892]: I0217 19:59:26.494682 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494759 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494857 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cqm\" (UniqueName: \"kubernetes.io/projected/7a1ac411-8b1b-4947-8e72-7d4401056d3f-kube-api-access-68cqm\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.494999 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.495024 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.495057 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.495173 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.495263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-2\") pod 
\"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597753 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597782 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597863 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597902 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597931 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.597994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.598051 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: 
\"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.598118 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.598138 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.598165 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.598186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cqm\" (UniqueName: \"kubernetes.io/projected/7a1ac411-8b1b-4947-8e72-7d4401056d3f-kube-api-access-68cqm\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.599829 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.600087 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.603203 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.603746 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.603846 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.603930 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.605005 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.605418 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.606145 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.610044 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: 
\"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.613290 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.614523 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.623048 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cqm\" (UniqueName: \"kubernetes.io/projected/7a1ac411-8b1b-4947-8e72-7d4401056d3f-kube-api-access-68cqm\") pod \"nova-cell1-openstack-openstack-cell1-8lwbt\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:26 crc kubenswrapper[4892]: I0217 19:59:26.773408 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 19:59:27 crc kubenswrapper[4892]: W0217 19:59:27.386192 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1ac411_8b1b_4947_8e72_7d4401056d3f.slice/crio-1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b WatchSource:0}: Error finding container 1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b: Status 404 returned error can't find the container with id 1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b Feb 17 19:59:27 crc kubenswrapper[4892]: I0217 19:59:27.386301 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-8lwbt"] Feb 17 19:59:28 crc kubenswrapper[4892]: I0217 19:59:28.350314 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" event={"ID":"7a1ac411-8b1b-4947-8e72-7d4401056d3f","Type":"ContainerStarted","Data":"6828402bb37b9bed63f53a27744e71467fc557fb15e3d4ff9607f76370c8ef8c"} Feb 17 19:59:28 crc kubenswrapper[4892]: I0217 19:59:28.350939 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" event={"ID":"7a1ac411-8b1b-4947-8e72-7d4401056d3f","Type":"ContainerStarted","Data":"1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b"} Feb 17 19:59:28 crc kubenswrapper[4892]: I0217 19:59:28.377998 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" podStartSLOduration=1.8619923269999998 podStartE2EDuration="2.377982208s" podCreationTimestamp="2026-02-17 19:59:26 +0000 UTC" firstStartedPulling="2026-02-17 19:59:27.388479447 +0000 UTC m=+8138.763882712" lastFinishedPulling="2026-02-17 19:59:27.904469288 +0000 UTC m=+8139.279872593" observedRunningTime="2026-02-17 19:59:28.370493606 +0000 
UTC m=+8139.745896871" watchObservedRunningTime="2026-02-17 19:59:28.377982208 +0000 UTC m=+8139.753385473" Feb 17 19:59:32 crc kubenswrapper[4892]: I0217 19:59:32.360128 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:59:32 crc kubenswrapper[4892]: E0217 19:59:32.361506 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.341161 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nkws5"] Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.345179 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.356265 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkws5"] Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.482773 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-catalog-content\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.482827 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-utilities\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.482892 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn8jw\" (UniqueName: \"kubernetes.io/projected/4e86219e-ad13-41c8-a805-fe900c25da05-kube-api-access-pn8jw\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.586035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-catalog-content\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.586090 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-utilities\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.586183 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn8jw\" (UniqueName: \"kubernetes.io/projected/4e86219e-ad13-41c8-a805-fe900c25da05-kube-api-access-pn8jw\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.586701 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-utilities\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.586700 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-catalog-content\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.609794 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn8jw\" (UniqueName: \"kubernetes.io/projected/4e86219e-ad13-41c8-a805-fe900c25da05-kube-api-access-pn8jw\") pod \"redhat-marketplace-nkws5\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:41 crc kubenswrapper[4892]: I0217 19:59:41.688099 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:42 crc kubenswrapper[4892]: I0217 19:59:42.215536 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkws5"] Feb 17 19:59:42 crc kubenswrapper[4892]: W0217 19:59:42.224797 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e86219e_ad13_41c8_a805_fe900c25da05.slice/crio-42a91461756da77991aae0f787d4f3858d409809205f5795d3e3d49e4c6238e7 WatchSource:0}: Error finding container 42a91461756da77991aae0f787d4f3858d409809205f5795d3e3d49e4c6238e7: Status 404 returned error can't find the container with id 42a91461756da77991aae0f787d4f3858d409809205f5795d3e3d49e4c6238e7 Feb 17 19:59:42 crc kubenswrapper[4892]: I0217 19:59:42.573120 4892 generic.go:334] "Generic (PLEG): container finished" podID="4e86219e-ad13-41c8-a805-fe900c25da05" containerID="ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae" exitCode=0 Feb 17 19:59:42 crc kubenswrapper[4892]: I0217 19:59:42.573196 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkws5" event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerDied","Data":"ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae"} Feb 17 19:59:42 crc kubenswrapper[4892]: I0217 19:59:42.573412 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkws5" event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerStarted","Data":"42a91461756da77991aae0f787d4f3858d409809205f5795d3e3d49e4c6238e7"} Feb 17 19:59:43 crc kubenswrapper[4892]: I0217 19:59:43.586266 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkws5" 
event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerStarted","Data":"d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f"} Feb 17 19:59:44 crc kubenswrapper[4892]: I0217 19:59:44.359895 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35" Feb 17 19:59:44 crc kubenswrapper[4892]: I0217 19:59:44.613180 4892 generic.go:334] "Generic (PLEG): container finished" podID="4e86219e-ad13-41c8-a805-fe900c25da05" containerID="d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f" exitCode=0 Feb 17 19:59:44 crc kubenswrapper[4892]: I0217 19:59:44.613277 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkws5" event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerDied","Data":"d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f"} Feb 17 19:59:45 crc kubenswrapper[4892]: I0217 19:59:45.627392 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkws5" event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerStarted","Data":"d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34"} Feb 17 19:59:45 crc kubenswrapper[4892]: I0217 19:59:45.630505 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"321964d924bc9b60d8e8d4a00f54b10a06cb73c81d8538ba205c584596e60b63"} Feb 17 19:59:45 crc kubenswrapper[4892]: I0217 19:59:45.657343 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nkws5" podStartSLOduration=2.083419005 podStartE2EDuration="4.657319363s" podCreationTimestamp="2026-02-17 19:59:41 +0000 UTC" firstStartedPulling="2026-02-17 19:59:42.57510611 +0000 UTC m=+8153.950509385" lastFinishedPulling="2026-02-17 
19:59:45.149006438 +0000 UTC m=+8156.524409743" observedRunningTime="2026-02-17 19:59:45.648692291 +0000 UTC m=+8157.024095556" watchObservedRunningTime="2026-02-17 19:59:45.657319363 +0000 UTC m=+8157.032722648" Feb 17 19:59:51 crc kubenswrapper[4892]: I0217 19:59:51.688903 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:51 crc kubenswrapper[4892]: I0217 19:59:51.689334 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:51 crc kubenswrapper[4892]: I0217 19:59:51.774167 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:51 crc kubenswrapper[4892]: I0217 19:59:51.848015 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:52 crc kubenswrapper[4892]: I0217 19:59:52.033136 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkws5"] Feb 17 19:59:53 crc kubenswrapper[4892]: I0217 19:59:53.752843 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nkws5" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="registry-server" containerID="cri-o://d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34" gracePeriod=2 Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.273923 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.394021 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-utilities\") pod \"4e86219e-ad13-41c8-a805-fe900c25da05\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.394197 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn8jw\" (UniqueName: \"kubernetes.io/projected/4e86219e-ad13-41c8-a805-fe900c25da05-kube-api-access-pn8jw\") pod \"4e86219e-ad13-41c8-a805-fe900c25da05\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.394291 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-catalog-content\") pod \"4e86219e-ad13-41c8-a805-fe900c25da05\" (UID: \"4e86219e-ad13-41c8-a805-fe900c25da05\") " Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.395810 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-utilities" (OuterVolumeSpecName: "utilities") pod "4e86219e-ad13-41c8-a805-fe900c25da05" (UID: "4e86219e-ad13-41c8-a805-fe900c25da05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.399921 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e86219e-ad13-41c8-a805-fe900c25da05-kube-api-access-pn8jw" (OuterVolumeSpecName: "kube-api-access-pn8jw") pod "4e86219e-ad13-41c8-a805-fe900c25da05" (UID: "4e86219e-ad13-41c8-a805-fe900c25da05"). InnerVolumeSpecName "kube-api-access-pn8jw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.422431 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e86219e-ad13-41c8-a805-fe900c25da05" (UID: "4e86219e-ad13-41c8-a805-fe900c25da05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.497964 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn8jw\" (UniqueName: \"kubernetes.io/projected/4e86219e-ad13-41c8-a805-fe900c25da05-kube-api-access-pn8jw\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.498009 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.498026 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e86219e-ad13-41c8-a805-fe900c25da05-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.771273 4892 generic.go:334] "Generic (PLEG): container finished" podID="4e86219e-ad13-41c8-a805-fe900c25da05" containerID="d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34" exitCode=0 Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.771330 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkws5" event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerDied","Data":"d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34"} Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.771363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-nkws5" event={"ID":"4e86219e-ad13-41c8-a805-fe900c25da05","Type":"ContainerDied","Data":"42a91461756da77991aae0f787d4f3858d409809205f5795d3e3d49e4c6238e7"} Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.771385 4892 scope.go:117] "RemoveContainer" containerID="d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.771410 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkws5" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.804563 4892 scope.go:117] "RemoveContainer" containerID="d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.856504 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkws5"] Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.861219 4892 scope.go:117] "RemoveContainer" containerID="ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.873004 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkws5"] Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.901855 4892 scope.go:117] "RemoveContainer" containerID="d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34" Feb 17 19:59:54 crc kubenswrapper[4892]: E0217 19:59:54.904429 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34\": container with ID starting with d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34 not found: ID does not exist" containerID="d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.904462 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34"} err="failed to get container status \"d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34\": rpc error: code = NotFound desc = could not find container \"d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34\": container with ID starting with d8c042110057a45bab099c8e24be91fb8ef08f8729e9daa272ff9d840ee2dc34 not found: ID does not exist" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.904483 4892 scope.go:117] "RemoveContainer" containerID="d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f" Feb 17 19:59:54 crc kubenswrapper[4892]: E0217 19:59:54.907140 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f\": container with ID starting with d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f not found: ID does not exist" containerID="d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.907165 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f"} err="failed to get container status \"d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f\": rpc error: code = NotFound desc = could not find container \"d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f\": container with ID starting with d4910ea5f742e3821b1409bf50c7fd2b3b019d7010828149725b7cfcffec560f not found: ID does not exist" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.907180 4892 scope.go:117] "RemoveContainer" containerID="ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae" Feb 17 19:59:54 crc kubenswrapper[4892]: E0217 
19:59:54.911397 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae\": container with ID starting with ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae not found: ID does not exist" containerID="ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae" Feb 17 19:59:54 crc kubenswrapper[4892]: I0217 19:59:54.911440 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae"} err="failed to get container status \"ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae\": rpc error: code = NotFound desc = could not find container \"ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae\": container with ID starting with ce43c64f9aaaa6162610af408452e52317a290ca19c076dca6132f232923d5ae not found: ID does not exist" Feb 17 19:59:55 crc kubenswrapper[4892]: I0217 19:59:55.370540 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" path="/var/lib/kubelet/pods/4e86219e-ad13-41c8-a805-fe900c25da05/volumes" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.198414 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q"] Feb 17 20:00:00 crc kubenswrapper[4892]: E0217 20:00:00.199555 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="registry-server" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.199574 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="registry-server" Feb 17 20:00:00 crc kubenswrapper[4892]: E0217 20:00:00.199592 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="extract-content" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.199601 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="extract-content" Feb 17 20:00:00 crc kubenswrapper[4892]: E0217 20:00:00.199628 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="extract-utilities" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.199637 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="extract-utilities" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.199957 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e86219e-ad13-41c8-a805-fe900c25da05" containerName="registry-server" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.201073 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.203281 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.203291 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.213936 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q"] Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.359019 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfrx\" (UniqueName: \"kubernetes.io/projected/657f4ea9-6fa9-448c-87de-23576a799f44-kube-api-access-pnfrx\") pod 
\"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.359102 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/657f4ea9-6fa9-448c-87de-23576a799f44-config-volume\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.360149 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/657f4ea9-6fa9-448c-87de-23576a799f44-secret-volume\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.462401 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfrx\" (UniqueName: \"kubernetes.io/projected/657f4ea9-6fa9-448c-87de-23576a799f44-kube-api-access-pnfrx\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.462464 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/657f4ea9-6fa9-448c-87de-23576a799f44-config-volume\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.462568 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/657f4ea9-6fa9-448c-87de-23576a799f44-secret-volume\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.463810 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/657f4ea9-6fa9-448c-87de-23576a799f44-config-volume\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.469917 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/657f4ea9-6fa9-448c-87de-23576a799f44-secret-volume\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.479236 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfrx\" (UniqueName: \"kubernetes.io/projected/657f4ea9-6fa9-448c-87de-23576a799f44-kube-api-access-pnfrx\") pod \"collect-profiles-29522640-7lx5q\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:00 crc kubenswrapper[4892]: I0217 20:00:00.526882 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:01 crc kubenswrapper[4892]: I0217 20:00:01.082479 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q"] Feb 17 20:00:01 crc kubenswrapper[4892]: I0217 20:00:01.863123 4892 generic.go:334] "Generic (PLEG): container finished" podID="657f4ea9-6fa9-448c-87de-23576a799f44" containerID="77d2ecf27abf6b7ea400f77a39448dd1730563940e1ae491a55f8e81ecb3c8c1" exitCode=0 Feb 17 20:00:01 crc kubenswrapper[4892]: I0217 20:00:01.863761 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" event={"ID":"657f4ea9-6fa9-448c-87de-23576a799f44","Type":"ContainerDied","Data":"77d2ecf27abf6b7ea400f77a39448dd1730563940e1ae491a55f8e81ecb3c8c1"} Feb 17 20:00:01 crc kubenswrapper[4892]: I0217 20:00:01.866173 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" event={"ID":"657f4ea9-6fa9-448c-87de-23576a799f44","Type":"ContainerStarted","Data":"685ab8cdaf1943470b03e521ebdf00d0ccde47233324a3c66b770aed526372e1"} Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.322395 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.460062 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/657f4ea9-6fa9-448c-87de-23576a799f44-secret-volume\") pod \"657f4ea9-6fa9-448c-87de-23576a799f44\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.460248 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnfrx\" (UniqueName: \"kubernetes.io/projected/657f4ea9-6fa9-448c-87de-23576a799f44-kube-api-access-pnfrx\") pod \"657f4ea9-6fa9-448c-87de-23576a799f44\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.460339 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/657f4ea9-6fa9-448c-87de-23576a799f44-config-volume\") pod \"657f4ea9-6fa9-448c-87de-23576a799f44\" (UID: \"657f4ea9-6fa9-448c-87de-23576a799f44\") " Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.462449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/657f4ea9-6fa9-448c-87de-23576a799f44-config-volume" (OuterVolumeSpecName: "config-volume") pod "657f4ea9-6fa9-448c-87de-23576a799f44" (UID: "657f4ea9-6fa9-448c-87de-23576a799f44"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.466081 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657f4ea9-6fa9-448c-87de-23576a799f44-kube-api-access-pnfrx" (OuterVolumeSpecName: "kube-api-access-pnfrx") pod "657f4ea9-6fa9-448c-87de-23576a799f44" (UID: "657f4ea9-6fa9-448c-87de-23576a799f44"). 
InnerVolumeSpecName "kube-api-access-pnfrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.466968 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657f4ea9-6fa9-448c-87de-23576a799f44-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "657f4ea9-6fa9-448c-87de-23576a799f44" (UID: "657f4ea9-6fa9-448c-87de-23576a799f44"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.562901 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/657f4ea9-6fa9-448c-87de-23576a799f44-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.562942 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnfrx\" (UniqueName: \"kubernetes.io/projected/657f4ea9-6fa9-448c-87de-23576a799f44-kube-api-access-pnfrx\") on node \"crc\" DevicePath \"\"" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.562953 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/657f4ea9-6fa9-448c-87de-23576a799f44-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.897058 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" event={"ID":"657f4ea9-6fa9-448c-87de-23576a799f44","Type":"ContainerDied","Data":"685ab8cdaf1943470b03e521ebdf00d0ccde47233324a3c66b770aed526372e1"} Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.897531 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="685ab8cdaf1943470b03e521ebdf00d0ccde47233324a3c66b770aed526372e1" Feb 17 20:00:03 crc kubenswrapper[4892]: I0217 20:00:03.897190 4892 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522640-7lx5q" Feb 17 20:00:04 crc kubenswrapper[4892]: I0217 20:00:04.411579 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5"] Feb 17 20:00:04 crc kubenswrapper[4892]: I0217 20:00:04.422546 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522595-9rrr5"] Feb 17 20:00:05 crc kubenswrapper[4892]: I0217 20:00:05.379400 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53744014-b001-443c-980c-2ee0a13a037c" path="/var/lib/kubelet/pods/53744014-b001-443c-980c-2ee0a13a037c/volumes" Feb 17 20:00:09 crc kubenswrapper[4892]: I0217 20:00:09.479222 4892 scope.go:117] "RemoveContainer" containerID="08cb4266fc063cfdd35144c7fc72be99ccf72818535563f1455af8a14738d6ed" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.766390 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xd56j"] Feb 17 20:00:53 crc kubenswrapper[4892]: E0217 20:00:53.767784 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657f4ea9-6fa9-448c-87de-23576a799f44" containerName="collect-profiles" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.767805 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="657f4ea9-6fa9-448c-87de-23576a799f44" containerName="collect-profiles" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.769438 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="657f4ea9-6fa9-448c-87de-23576a799f44" containerName="collect-profiles" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.774795 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.792646 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xd56j"] Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.850988 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-catalog-content\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.851383 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsf7j\" (UniqueName: \"kubernetes.io/projected/4d3cb986-c9cb-466b-9345-7e169485d069-kube-api-access-zsf7j\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.851448 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-utilities\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.954038 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-catalog-content\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.954577 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zsf7j\" (UniqueName: \"kubernetes.io/projected/4d3cb986-c9cb-466b-9345-7e169485d069-kube-api-access-zsf7j\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.954890 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-utilities\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.955919 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-utilities\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.956549 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-catalog-content\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:53 crc kubenswrapper[4892]: I0217 20:00:53.978142 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsf7j\" (UniqueName: \"kubernetes.io/projected/4d3cb986-c9cb-466b-9345-7e169485d069-kube-api-access-zsf7j\") pod \"redhat-operators-xd56j\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:54 crc kubenswrapper[4892]: I0217 20:00:54.131784 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:00:54 crc kubenswrapper[4892]: I0217 20:00:54.625204 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xd56j"] Feb 17 20:00:55 crc kubenswrapper[4892]: I0217 20:00:55.524212 4892 generic.go:334] "Generic (PLEG): container finished" podID="4d3cb986-c9cb-466b-9345-7e169485d069" containerID="f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f" exitCode=0 Feb 17 20:00:55 crc kubenswrapper[4892]: I0217 20:00:55.524267 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd56j" event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerDied","Data":"f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f"} Feb 17 20:00:55 crc kubenswrapper[4892]: I0217 20:00:55.524321 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd56j" event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerStarted","Data":"7a07c6448089df519370ab7aa1d95cec6be3b6abc410909cc85d33cd3daf4f19"} Feb 17 20:00:56 crc kubenswrapper[4892]: I0217 20:00:56.541378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd56j" event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerStarted","Data":"721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c"} Feb 17 20:00:57 crc kubenswrapper[4892]: I0217 20:00:57.552801 4892 generic.go:334] "Generic (PLEG): container finished" podID="4d3cb986-c9cb-466b-9345-7e169485d069" containerID="721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c" exitCode=0 Feb 17 20:00:57 crc kubenswrapper[4892]: I0217 20:00:57.552969 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd56j" 
event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerDied","Data":"721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c"} Feb 17 20:00:58 crc kubenswrapper[4892]: I0217 20:00:58.609000 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd56j" event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerStarted","Data":"f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c"} Feb 17 20:00:58 crc kubenswrapper[4892]: I0217 20:00:58.633761 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xd56j" podStartSLOduration=3.1877488290000002 podStartE2EDuration="5.633741994s" podCreationTimestamp="2026-02-17 20:00:53 +0000 UTC" firstStartedPulling="2026-02-17 20:00:55.526621222 +0000 UTC m=+8226.902024487" lastFinishedPulling="2026-02-17 20:00:57.972614377 +0000 UTC m=+8229.348017652" observedRunningTime="2026-02-17 20:00:58.630850916 +0000 UTC m=+8230.006254221" watchObservedRunningTime="2026-02-17 20:00:58.633741994 +0000 UTC m=+8230.009145279" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.166927 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29522641-mcslv"] Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.169699 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.209537 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522641-mcslv"] Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.216635 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-config-data\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.216849 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kr2\" (UniqueName: \"kubernetes.io/projected/86917ae5-f69c-4bcf-a090-99818cf3c471-kube-api-access-j6kr2\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.216898 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-combined-ca-bundle\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.217183 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-fernet-keys\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.319425 4892 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-config-data\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.319581 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kr2\" (UniqueName: \"kubernetes.io/projected/86917ae5-f69c-4bcf-a090-99818cf3c471-kube-api-access-j6kr2\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.319622 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-combined-ca-bundle\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.319676 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-fernet-keys\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.325840 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-fernet-keys\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.326407 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-combined-ca-bundle\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.328949 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-config-data\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.338737 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kr2\" (UniqueName: \"kubernetes.io/projected/86917ae5-f69c-4bcf-a090-99818cf3c471-kube-api-access-j6kr2\") pod \"keystone-cron-29522641-mcslv\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:00 crc kubenswrapper[4892]: I0217 20:01:00.512509 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:01 crc kubenswrapper[4892]: I0217 20:01:01.073113 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522641-mcslv"] Feb 17 20:01:01 crc kubenswrapper[4892]: I0217 20:01:01.653272 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522641-mcslv" event={"ID":"86917ae5-f69c-4bcf-a090-99818cf3c471","Type":"ContainerStarted","Data":"8bb11bf1e4db88187e1697f3093e1e6980c6e374dd5466e8c1b4b64e08e0af6e"} Feb 17 20:01:01 crc kubenswrapper[4892]: I0217 20:01:01.653612 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522641-mcslv" event={"ID":"86917ae5-f69c-4bcf-a090-99818cf3c471","Type":"ContainerStarted","Data":"fb429757a3509f457d4ace83ad4a053f28a64703cb54fee6fcf7ee1c2c8646b4"} Feb 17 20:01:04 crc kubenswrapper[4892]: I0217 20:01:04.132057 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:01:04 crc kubenswrapper[4892]: I0217 20:01:04.132571 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:01:04 crc kubenswrapper[4892]: I0217 20:01:04.692291 4892 generic.go:334] "Generic (PLEG): container finished" podID="86917ae5-f69c-4bcf-a090-99818cf3c471" containerID="8bb11bf1e4db88187e1697f3093e1e6980c6e374dd5466e8c1b4b64e08e0af6e" exitCode=0 Feb 17 20:01:04 crc kubenswrapper[4892]: I0217 20:01:04.692385 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522641-mcslv" event={"ID":"86917ae5-f69c-4bcf-a090-99818cf3c471","Type":"ContainerDied","Data":"8bb11bf1e4db88187e1697f3093e1e6980c6e374dd5466e8c1b4b64e08e0af6e"} Feb 17 20:01:05 crc kubenswrapper[4892]: I0217 20:01:05.187509 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xd56j" 
podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="registry-server" probeResult="failure" output=< Feb 17 20:01:05 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 20:01:05 crc kubenswrapper[4892]: > Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.207171 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.355669 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6kr2\" (UniqueName: \"kubernetes.io/projected/86917ae5-f69c-4bcf-a090-99818cf3c471-kube-api-access-j6kr2\") pod \"86917ae5-f69c-4bcf-a090-99818cf3c471\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.355807 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-combined-ca-bundle\") pod \"86917ae5-f69c-4bcf-a090-99818cf3c471\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.355880 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-fernet-keys\") pod \"86917ae5-f69c-4bcf-a090-99818cf3c471\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.356067 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-config-data\") pod \"86917ae5-f69c-4bcf-a090-99818cf3c471\" (UID: \"86917ae5-f69c-4bcf-a090-99818cf3c471\") " Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.362043 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "86917ae5-f69c-4bcf-a090-99818cf3c471" (UID: "86917ae5-f69c-4bcf-a090-99818cf3c471"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.362178 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86917ae5-f69c-4bcf-a090-99818cf3c471-kube-api-access-j6kr2" (OuterVolumeSpecName: "kube-api-access-j6kr2") pod "86917ae5-f69c-4bcf-a090-99818cf3c471" (UID: "86917ae5-f69c-4bcf-a090-99818cf3c471"). InnerVolumeSpecName "kube-api-access-j6kr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.416954 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86917ae5-f69c-4bcf-a090-99818cf3c471" (UID: "86917ae5-f69c-4bcf-a090-99818cf3c471"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.417715 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-config-data" (OuterVolumeSpecName: "config-data") pod "86917ae5-f69c-4bcf-a090-99818cf3c471" (UID: "86917ae5-f69c-4bcf-a090-99818cf3c471"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.459575 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6kr2\" (UniqueName: \"kubernetes.io/projected/86917ae5-f69c-4bcf-a090-99818cf3c471-kube-api-access-j6kr2\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.459797 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.459916 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.459971 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86917ae5-f69c-4bcf-a090-99818cf3c471-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.725967 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522641-mcslv" event={"ID":"86917ae5-f69c-4bcf-a090-99818cf3c471","Type":"ContainerDied","Data":"fb429757a3509f457d4ace83ad4a053f28a64703cb54fee6fcf7ee1c2c8646b4"} Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.726020 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb429757a3509f457d4ace83ad4a053f28a64703cb54fee6fcf7ee1c2c8646b4" Feb 17 20:01:06 crc kubenswrapper[4892]: I0217 20:01:06.726056 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522641-mcslv" Feb 17 20:01:14 crc kubenswrapper[4892]: I0217 20:01:14.210453 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:01:14 crc kubenswrapper[4892]: I0217 20:01:14.277409 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:01:14 crc kubenswrapper[4892]: I0217 20:01:14.458321 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xd56j"] Feb 17 20:01:15 crc kubenswrapper[4892]: I0217 20:01:15.845948 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xd56j" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="registry-server" containerID="cri-o://f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c" gracePeriod=2 Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.428363 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.630140 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsf7j\" (UniqueName: \"kubernetes.io/projected/4d3cb986-c9cb-466b-9345-7e169485d069-kube-api-access-zsf7j\") pod \"4d3cb986-c9cb-466b-9345-7e169485d069\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.630438 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-catalog-content\") pod \"4d3cb986-c9cb-466b-9345-7e169485d069\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.630493 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-utilities\") pod \"4d3cb986-c9cb-466b-9345-7e169485d069\" (UID: \"4d3cb986-c9cb-466b-9345-7e169485d069\") " Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.631386 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-utilities" (OuterVolumeSpecName: "utilities") pod "4d3cb986-c9cb-466b-9345-7e169485d069" (UID: "4d3cb986-c9cb-466b-9345-7e169485d069"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.636383 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3cb986-c9cb-466b-9345-7e169485d069-kube-api-access-zsf7j" (OuterVolumeSpecName: "kube-api-access-zsf7j") pod "4d3cb986-c9cb-466b-9345-7e169485d069" (UID: "4d3cb986-c9cb-466b-9345-7e169485d069"). InnerVolumeSpecName "kube-api-access-zsf7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.733382 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.733428 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsf7j\" (UniqueName: \"kubernetes.io/projected/4d3cb986-c9cb-466b-9345-7e169485d069-kube-api-access-zsf7j\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.768872 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d3cb986-c9cb-466b-9345-7e169485d069" (UID: "4d3cb986-c9cb-466b-9345-7e169485d069"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.834378 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3cb986-c9cb-466b-9345-7e169485d069-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.860120 4892 generic.go:334] "Generic (PLEG): container finished" podID="4d3cb986-c9cb-466b-9345-7e169485d069" containerID="f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c" exitCode=0 Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.860165 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd56j" event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerDied","Data":"f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c"} Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.860242 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xd56j" event={"ID":"4d3cb986-c9cb-466b-9345-7e169485d069","Type":"ContainerDied","Data":"7a07c6448089df519370ab7aa1d95cec6be3b6abc410909cc85d33cd3daf4f19"} Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.860271 4892 scope.go:117] "RemoveContainer" containerID="f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.860289 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xd56j" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.891792 4892 scope.go:117] "RemoveContainer" containerID="721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.918770 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xd56j"] Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.936012 4892 scope.go:117] "RemoveContainer" containerID="f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.964096 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xd56j"] Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.995234 4892 scope.go:117] "RemoveContainer" containerID="f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c" Feb 17 20:01:16 crc kubenswrapper[4892]: E0217 20:01:16.995840 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c\": container with ID starting with f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c not found: ID does not exist" containerID="f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.995905 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c"} err="failed to get container status \"f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c\": rpc error: code = NotFound desc = could not find container \"f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c\": container with ID starting with f807d3c556aa5f21366096889af3b327dbb5a2e408d69254a636ef3051fd993c not found: ID does not exist" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.995943 4892 scope.go:117] "RemoveContainer" containerID="721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c" Feb 17 20:01:16 crc kubenswrapper[4892]: E0217 20:01:16.996569 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c\": container with ID starting with 721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c not found: ID does not exist" containerID="721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.996606 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c"} err="failed to get container status \"721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c\": rpc error: code = NotFound desc = could not find container \"721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c\": container with ID starting with 721f89ae6c2af8f5061158c9fce7c5204896b8574a3007d33583097e8ecf6f0c not found: ID does not exist" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.996626 4892 scope.go:117] "RemoveContainer" containerID="f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f" Feb 17 20:01:16 crc kubenswrapper[4892]: E0217 
20:01:16.996962 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f\": container with ID starting with f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f not found: ID does not exist" containerID="f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f" Feb 17 20:01:16 crc kubenswrapper[4892]: I0217 20:01:16.996995 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f"} err="failed to get container status \"f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f\": rpc error: code = NotFound desc = could not find container \"f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f\": container with ID starting with f4340cae34ac21b29e95f6157dfa2bf2f41694058c85e37a6119b522131af72f not found: ID does not exist" Feb 17 20:01:17 crc kubenswrapper[4892]: I0217 20:01:17.376208 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" path="/var/lib/kubelet/pods/4d3cb986-c9cb-466b-9345-7e169485d069/volumes" Feb 17 20:02:07 crc kubenswrapper[4892]: I0217 20:02:07.425574 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:02:07 crc kubenswrapper[4892]: I0217 20:02:07.426110 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 17 20:02:28 crc kubenswrapper[4892]: I0217 20:02:28.867988 4892 generic.go:334] "Generic (PLEG): container finished" podID="7a1ac411-8b1b-4947-8e72-7d4401056d3f" containerID="6828402bb37b9bed63f53a27744e71467fc557fb15e3d4ff9607f76370c8ef8c" exitCode=0 Feb 17 20:02:28 crc kubenswrapper[4892]: I0217 20:02:28.868102 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" event={"ID":"7a1ac411-8b1b-4947-8e72-7d4401056d3f","Type":"ContainerDied","Data":"6828402bb37b9bed63f53a27744e71467fc557fb15e3d4ff9607f76370c8ef8c"} Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.396026 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.490668 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-combined-ca-bundle\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.490752 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-2\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.490870 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ceph\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.490937 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-1\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491010 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-0\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491034 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cqm\" (UniqueName: \"kubernetes.io/projected/7a1ac411-8b1b-4947-8e72-7d4401056d3f-kube-api-access-68cqm\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491075 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-1\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491094 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-0\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ssh-key-openstack-cell1\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491140 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-0\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491156 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-inventory\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491181 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-1\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.491210 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-3\") pod \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\" (UID: \"7a1ac411-8b1b-4947-8e72-7d4401056d3f\") " Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.496501 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1ac411-8b1b-4947-8e72-7d4401056d3f-kube-api-access-68cqm" (OuterVolumeSpecName: "kube-api-access-68cqm") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: 
"7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "kube-api-access-68cqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.497449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.497986 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ceph" (OuterVolumeSpecName: "ceph") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.526307 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.527256 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.528959 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.534566 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.540313 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.545523 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-inventory" (OuterVolumeSpecName: "inventory") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.553730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.554540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.561976 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.566597 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "7a1ac411-8b1b-4947-8e72-7d4401056d3f" (UID: "7a1ac411-8b1b-4947-8e72-7d4401056d3f"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599410 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599448 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599459 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599468 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599477 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599487 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cqm\" (UniqueName: \"kubernetes.io/projected/7a1ac411-8b1b-4947-8e72-7d4401056d3f-kube-api-access-68cqm\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599499 4892 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599513 4892 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599524 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599536 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599545 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599554 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.599563 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a1ac411-8b1b-4947-8e72-7d4401056d3f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.888643 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" 
event={"ID":"7a1ac411-8b1b-4947-8e72-7d4401056d3f","Type":"ContainerDied","Data":"1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b"} Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.889137 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b" Feb 17 20:02:30 crc kubenswrapper[4892]: I0217 20:02:30.888741 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8lwbt" Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.022628 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-pnvpv"] Feb 17 20:02:31 crc kubenswrapper[4892]: E0217 20:02:31.023539 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1ac411-8b1b-4947-8e72-7d4401056d3f" containerName="nova-cell1-openstack-openstack-cell1" Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.023572 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1ac411-8b1b-4947-8e72-7d4401056d3f" containerName="nova-cell1-openstack-openstack-cell1" Feb 17 20:02:31 crc kubenswrapper[4892]: E0217 20:02:31.023669 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="registry-server" Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.023687 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="registry-server" Feb 17 20:02:31 crc kubenswrapper[4892]: E0217 20:02:31.023715 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86917ae5-f69c-4bcf-a090-99818cf3c471" containerName="keystone-cron" Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.023728 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86917ae5-f69c-4bcf-a090-99818cf3c471" containerName="keystone-cron" Feb 17 20:02:31 crc 
kubenswrapper[4892]: E0217 20:02:31.023760 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="extract-content"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.023772 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="extract-content"
Feb 17 20:02:31 crc kubenswrapper[4892]: E0217 20:02:31.023809 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="extract-utilities"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.023852 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="extract-utilities"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.024319 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="86917ae5-f69c-4bcf-a090-99818cf3c471" containerName="keystone-cron"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.024365 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1ac411-8b1b-4947-8e72-7d4401056d3f" containerName="nova-cell1-openstack-openstack-cell1"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.024402 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3cb986-c9cb-466b-9345-7e169485d069" containerName="registry-server"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.043331 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-pnvpv"]
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.043471 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.045469 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.045559 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.045800 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.046655 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.047301 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120094 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceph\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120155 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120176 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-inventory\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120201 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120246 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120306 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sprlj\" (UniqueName: \"kubernetes.io/projected/164b2658-87b4-4639-98bf-9b594c5a8b00-kube-api-access-sprlj\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120350 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.120367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: E0217 20:02:31.182669 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1ac411_8b1b_4947_8e72_7d4401056d3f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1ac411_8b1b_4947_8e72_7d4401056d3f.slice/crio-1974b06a392da58acc19c326b1325a63f67da8563d70b1d4a1f0291b49f56d1b\": RecentStats: unable to find data in memory cache]"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.222970 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceph\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223056 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-inventory\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223083 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223135 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223157 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sprlj\" (UniqueName: \"kubernetes.io/projected/164b2658-87b4-4639-98bf-9b594c5a8b00-kube-api-access-sprlj\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.223921 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.227437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.227789 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceph\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.227832 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-inventory\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.228032 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.228386 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.229763 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.237454 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.244532 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sprlj\" (UniqueName: \"kubernetes.io/projected/164b2658-87b4-4639-98bf-9b594c5a8b00-kube-api-access-sprlj\") pod \"telemetry-openstack-openstack-cell1-pnvpv\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.383577 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:02:31 crc kubenswrapper[4892]: I0217 20:02:31.997719 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-pnvpv"]
Feb 17 20:02:32 crc kubenswrapper[4892]: I0217 20:02:32.925429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv" event={"ID":"164b2658-87b4-4639-98bf-9b594c5a8b00","Type":"ContainerStarted","Data":"5fba2422effda81dab3b1f468cc61ff6477e78bea8458256d9776c5679b35f61"}
Feb 17 20:02:32 crc kubenswrapper[4892]: I0217 20:02:32.926008 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv" event={"ID":"164b2658-87b4-4639-98bf-9b594c5a8b00","Type":"ContainerStarted","Data":"f8ddb7704ded09347816bdc2ebeef5e52a0985b3fa06555590ca08b0901c44e2"}
Feb 17 20:02:32 crc kubenswrapper[4892]: I0217 20:02:32.962089 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv" podStartSLOduration=2.496430255 podStartE2EDuration="2.962072042s" podCreationTimestamp="2026-02-17 20:02:30 +0000 UTC" firstStartedPulling="2026-02-17 20:02:32.000741719 +0000 UTC m=+8323.376144994" lastFinishedPulling="2026-02-17 20:02:32.466383506 +0000 UTC m=+8323.841786781" observedRunningTime="2026-02-17 20:02:32.953783079 +0000 UTC m=+8324.329186354" watchObservedRunningTime="2026-02-17 20:02:32.962072042 +0000 UTC m=+8324.337475307"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.425533 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.426326 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.467951 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pqz68"]
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.471400 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.481431 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqz68"]
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.585179 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-utilities\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.585272 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-catalog-content\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.585366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb46q\" (UniqueName: \"kubernetes.io/projected/0c8f0c82-356f-49f5-8136-f16f02aefc3d-kube-api-access-nb46q\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.687197 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-utilities\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.687276 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-catalog-content\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.687324 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb46q\" (UniqueName: \"kubernetes.io/projected/0c8f0c82-356f-49f5-8136-f16f02aefc3d-kube-api-access-nb46q\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.687757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-utilities\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.687833 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-catalog-content\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.706976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb46q\" (UniqueName: \"kubernetes.io/projected/0c8f0c82-356f-49f5-8136-f16f02aefc3d-kube-api-access-nb46q\") pod \"community-operators-pqz68\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") " pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:37 crc kubenswrapper[4892]: I0217 20:02:37.795443 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:38 crc kubenswrapper[4892]: I0217 20:02:38.375121 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqz68"]
Feb 17 20:02:39 crc kubenswrapper[4892]: I0217 20:02:39.017152 4892 generic.go:334] "Generic (PLEG): container finished" podID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerID="b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79" exitCode=0
Feb 17 20:02:39 crc kubenswrapper[4892]: I0217 20:02:39.017208 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerDied","Data":"b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79"}
Feb 17 20:02:39 crc kubenswrapper[4892]: I0217 20:02:39.017605 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerStarted","Data":"9792d33b771b08ec643c1ebabc0a202edf3c4ec386b43482264596f5d3ba2072"}
Feb 17 20:02:40 crc kubenswrapper[4892]: I0217 20:02:40.033219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerStarted","Data":"9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1"}
Feb 17 20:02:41 crc kubenswrapper[4892]: I0217 20:02:41.049809 4892 generic.go:334] "Generic (PLEG): container finished" podID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerID="9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1" exitCode=0
Feb 17 20:02:41 crc kubenswrapper[4892]: I0217 20:02:41.049963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerDied","Data":"9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1"}
Feb 17 20:02:42 crc kubenswrapper[4892]: I0217 20:02:42.073906 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerStarted","Data":"f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344"}
Feb 17 20:02:42 crc kubenswrapper[4892]: I0217 20:02:42.098344 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pqz68" podStartSLOduration=2.625296397 podStartE2EDuration="5.09831747s" podCreationTimestamp="2026-02-17 20:02:37 +0000 UTC" firstStartedPulling="2026-02-17 20:02:39.020741322 +0000 UTC m=+8330.396144587" lastFinishedPulling="2026-02-17 20:02:41.493762335 +0000 UTC m=+8332.869165660" observedRunningTime="2026-02-17 20:02:42.089463011 +0000 UTC m=+8333.464866316" watchObservedRunningTime="2026-02-17 20:02:42.09831747 +0000 UTC m=+8333.473720765"
Feb 17 20:02:47 crc kubenswrapper[4892]: I0217 20:02:47.795655 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:47 crc kubenswrapper[4892]: I0217 20:02:47.796350 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:47 crc kubenswrapper[4892]: I0217 20:02:47.894425 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:48 crc kubenswrapper[4892]: I0217 20:02:48.208065 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:48 crc kubenswrapper[4892]: I0217 20:02:48.274799 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pqz68"]
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.168620 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pqz68" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="registry-server" containerID="cri-o://f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344" gracePeriod=2
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.820884 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.942785 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb46q\" (UniqueName: \"kubernetes.io/projected/0c8f0c82-356f-49f5-8136-f16f02aefc3d-kube-api-access-nb46q\") pod \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") "
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.943363 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-catalog-content\") pod \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") "
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.943387 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-utilities\") pod \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\" (UID: \"0c8f0c82-356f-49f5-8136-f16f02aefc3d\") "
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.944396 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-utilities" (OuterVolumeSpecName: "utilities") pod "0c8f0c82-356f-49f5-8136-f16f02aefc3d" (UID: "0c8f0c82-356f-49f5-8136-f16f02aefc3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:02:50 crc kubenswrapper[4892]: I0217 20:02:50.960149 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8f0c82-356f-49f5-8136-f16f02aefc3d-kube-api-access-nb46q" (OuterVolumeSpecName: "kube-api-access-nb46q") pod "0c8f0c82-356f-49f5-8136-f16f02aefc3d" (UID: "0c8f0c82-356f-49f5-8136-f16f02aefc3d"). InnerVolumeSpecName "kube-api-access-nb46q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.046572 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb46q\" (UniqueName: \"kubernetes.io/projected/0c8f0c82-356f-49f5-8136-f16f02aefc3d-kube-api-access-nb46q\") on node \"crc\" DevicePath \"\""
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.046634 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.182057 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqz68"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.182093 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerDied","Data":"f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344"}
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.182142 4892 scope.go:117] "RemoveContainer" containerID="f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.181949 4892 generic.go:334] "Generic (PLEG): container finished" podID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerID="f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344" exitCode=0
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.183112 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqz68" event={"ID":"0c8f0c82-356f-49f5-8136-f16f02aefc3d","Type":"ContainerDied","Data":"9792d33b771b08ec643c1ebabc0a202edf3c4ec386b43482264596f5d3ba2072"}
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.206120 4892 scope.go:117] "RemoveContainer" containerID="9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.229422 4892 scope.go:117] "RemoveContainer" containerID="b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.252671 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c8f0c82-356f-49f5-8136-f16f02aefc3d" (UID: "0c8f0c82-356f-49f5-8136-f16f02aefc3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.287082 4892 scope.go:117] "RemoveContainer" containerID="f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344"
Feb 17 20:02:51 crc kubenswrapper[4892]: E0217 20:02:51.287739 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344\": container with ID starting with f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344 not found: ID does not exist" containerID="f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.287998 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344"} err="failed to get container status \"f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344\": rpc error: code = NotFound desc = could not find container \"f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344\": container with ID starting with f0d3ca77f2fa4b41eeb27a62412155c30f62299776a03ebbd74c94894eea4344 not found: ID does not exist"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.288059 4892 scope.go:117] "RemoveContainer" containerID="9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1"
Feb 17 20:02:51 crc kubenswrapper[4892]: E0217 20:02:51.288548 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1\": container with ID starting with 9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1 not found: ID does not exist" containerID="9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.288588 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1"} err="failed to get container status \"9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1\": rpc error: code = NotFound desc = could not find container \"9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1\": container with ID starting with 9df2f34ac1108c828b3041fbe65e00a83cae093f484201bc83f50a61a7659ef1 not found: ID does not exist"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.288619 4892 scope.go:117] "RemoveContainer" containerID="b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79"
Feb 17 20:02:51 crc kubenswrapper[4892]: E0217 20:02:51.288956 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79\": container with ID starting with b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79 not found: ID does not exist" containerID="b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.288995 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79"} err="failed to get container status \"b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79\": rpc error: code = NotFound desc = could not find container \"b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79\": container with ID starting with b33360cae6ac8223b6df7f8dd2cde8a152da01362f8fdd55a21a0c3f428f7d79 not found: ID does not exist"
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.353986 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c8f0c82-356f-49f5-8136-f16f02aefc3d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.544910 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pqz68"]
Feb 17 20:02:51 crc kubenswrapper[4892]: I0217 20:02:51.559944 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pqz68"]
Feb 17 20:02:53 crc kubenswrapper[4892]: I0217 20:02:53.374214 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" path="/var/lib/kubelet/pods/0c8f0c82-356f-49f5-8136-f16f02aefc3d/volumes"
Feb 17 20:03:07 crc kubenswrapper[4892]: I0217 20:03:07.424603 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 20:03:07 crc kubenswrapper[4892]: I0217 20:03:07.425424 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 20:03:07 crc kubenswrapper[4892]: I0217 20:03:07.425506 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt"
Feb 17 20:03:07 crc kubenswrapper[4892]: I0217 20:03:07.426907 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"321964d924bc9b60d8e8d4a00f54b10a06cb73c81d8538ba205c584596e60b63"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 20:03:07 crc kubenswrapper[4892]: I0217 20:03:07.427042 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://321964d924bc9b60d8e8d4a00f54b10a06cb73c81d8538ba205c584596e60b63" gracePeriod=600
Feb 17 20:03:08 crc kubenswrapper[4892]: I0217 20:03:08.393280 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="321964d924bc9b60d8e8d4a00f54b10a06cb73c81d8538ba205c584596e60b63" exitCode=0
Feb 17 20:03:08 crc kubenswrapper[4892]: I0217 20:03:08.393391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"321964d924bc9b60d8e8d4a00f54b10a06cb73c81d8538ba205c584596e60b63"}
Feb 17 20:03:08 crc kubenswrapper[4892]: I0217 20:03:08.393925 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8"}
Feb 17 20:03:08 crc kubenswrapper[4892]: I0217 20:03:08.393948 4892 scope.go:117] "RemoveContainer" containerID="6c0d3b57fe7ce8a8bf0e7ab5ed7eb5daea00ebff5bc63d153811e8fa47cadb35"
Feb 17 20:05:07 crc kubenswrapper[4892]: I0217 20:05:07.425577 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 20:05:07 crc kubenswrapper[4892]: I0217 20:05:07.426406 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 20:05:37 crc kubenswrapper[4892]: I0217 20:05:37.424423 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 20:05:37 crc kubenswrapper[4892]: I0217 20:05:37.425992 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 20:05:53 crc kubenswrapper[4892]: I0217 20:05:53.564398 4892 generic.go:334] "Generic (PLEG): container finished" podID="164b2658-87b4-4639-98bf-9b594c5a8b00" containerID="5fba2422effda81dab3b1f468cc61ff6477e78bea8458256d9776c5679b35f61" exitCode=0
Feb 17 20:05:53 crc kubenswrapper[4892]: I0217 20:05:53.564493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv" event={"ID":"164b2658-87b4-4639-98bf-9b594c5a8b00","Type":"ContainerDied","Data":"5fba2422effda81dab3b1f468cc61ff6477e78bea8458256d9776c5679b35f61"}
Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.140169 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv"
Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.313168 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ssh-key-openstack-cell1\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") "
Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.313215 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-inventory\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") "
Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.313278 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-2\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") "
Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.314056 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-telemetry-combined-ca-bundle\")
pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.314325 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-0\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.314376 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sprlj\" (UniqueName: \"kubernetes.io/projected/164b2658-87b4-4639-98bf-9b594c5a8b00-kube-api-access-sprlj\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.314417 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-1\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.314464 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceph\") pod \"164b2658-87b4-4639-98bf-9b594c5a8b00\" (UID: \"164b2658-87b4-4639-98bf-9b594c5a8b00\") " Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.319290 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceph" (OuterVolumeSpecName: "ceph") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.319873 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164b2658-87b4-4639-98bf-9b594c5a8b00-kube-api-access-sprlj" (OuterVolumeSpecName: "kube-api-access-sprlj") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "kube-api-access-sprlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.320747 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.347538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-inventory" (OuterVolumeSpecName: "inventory") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.350898 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.352512 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.354868 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.355110 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "164b2658-87b4-4639-98bf-9b594c5a8b00" (UID: "164b2658-87b4-4639-98bf-9b594c5a8b00"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.419517 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.419947 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.420257 4892 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.423162 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.423357 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sprlj\" (UniqueName: \"kubernetes.io/projected/164b2658-87b4-4639-98bf-9b594c5a8b00-kube-api-access-sprlj\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.423606 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.423771 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ceph\") 
on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.424497 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/164b2658-87b4-4639-98bf-9b594c5a8b00-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.597616 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv" event={"ID":"164b2658-87b4-4639-98bf-9b594c5a8b00","Type":"ContainerDied","Data":"f8ddb7704ded09347816bdc2ebeef5e52a0985b3fa06555590ca08b0901c44e2"} Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.598011 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8ddb7704ded09347816bdc2ebeef5e52a0985b3fa06555590ca08b0901c44e2" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.597706 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-pnvpv" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.720204 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-v5r58"] Feb 17 20:05:55 crc kubenswrapper[4892]: E0217 20:05:55.720904 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="extract-utilities" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.720933 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="extract-utilities" Feb 17 20:05:55 crc kubenswrapper[4892]: E0217 20:05:55.720983 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="extract-content" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.720993 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="extract-content" Feb 17 20:05:55 crc kubenswrapper[4892]: E0217 20:05:55.721012 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="registry-server" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.721020 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="registry-server" Feb 17 20:05:55 crc kubenswrapper[4892]: E0217 20:05:55.721046 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164b2658-87b4-4639-98bf-9b594c5a8b00" containerName="telemetry-openstack-openstack-cell1" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.721053 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="164b2658-87b4-4639-98bf-9b594c5a8b00" containerName="telemetry-openstack-openstack-cell1" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.721364 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8f0c82-356f-49f5-8136-f16f02aefc3d" containerName="registry-server" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.721409 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="164b2658-87b4-4639-98bf-9b594c5a8b00" containerName="telemetry-openstack-openstack-cell1" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.722457 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.724911 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.724946 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.725072 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.725309 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.732620 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.747594 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-v5r58"] Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.837555 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkxcf\" (UniqueName: \"kubernetes.io/projected/0dcceec2-d889-492f-921f-0da0329466c4-kube-api-access-bkxcf\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.837864 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.837943 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.837963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.837990 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.838125 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.940933 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bkxcf\" (UniqueName: \"kubernetes.io/projected/0dcceec2-d889-492f-921f-0da0329466c4-kube-api-access-bkxcf\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.941074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.941247 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.941303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.941418 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: 
\"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.941561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.947022 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.947369 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.948790 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.953655 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.973590 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:55 crc kubenswrapper[4892]: I0217 20:05:55.979859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkxcf\" (UniqueName: \"kubernetes.io/projected/0dcceec2-d889-492f-921f-0da0329466c4-kube-api-access-bkxcf\") pod \"neutron-sriov-openstack-openstack-cell1-v5r58\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:56 crc kubenswrapper[4892]: I0217 20:05:56.039638 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:05:56 crc kubenswrapper[4892]: I0217 20:05:56.749417 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-v5r58"] Feb 17 20:05:56 crc kubenswrapper[4892]: I0217 20:05:56.752012 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 20:05:57 crc kubenswrapper[4892]: I0217 20:05:57.624001 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" event={"ID":"0dcceec2-d889-492f-921f-0da0329466c4","Type":"ContainerStarted","Data":"a56ab344ea4896a52b468632a1071f51a31438cb11a58b2f69cb0de5facedb52"} Feb 17 20:05:57 crc kubenswrapper[4892]: I0217 20:05:57.624410 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" event={"ID":"0dcceec2-d889-492f-921f-0da0329466c4","Type":"ContainerStarted","Data":"1d99f73f633ff57b03a333741e41222b39445a820b08c309a1656600edb5b757"} Feb 17 20:05:57 crc kubenswrapper[4892]: I0217 20:05:57.650795 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" podStartSLOduration=2.182841849 podStartE2EDuration="2.650774798s" podCreationTimestamp="2026-02-17 20:05:55 +0000 UTC" firstStartedPulling="2026-02-17 20:05:56.751806113 +0000 UTC m=+8528.127209378" lastFinishedPulling="2026-02-17 20:05:57.219739022 +0000 UTC m=+8528.595142327" observedRunningTime="2026-02-17 20:05:57.645220009 +0000 UTC m=+8529.020623304" watchObservedRunningTime="2026-02-17 20:05:57.650774798 +0000 UTC m=+8529.026178073" Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.425557 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.426089 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.426128 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.426930 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.427008 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" gracePeriod=600 Feb 17 20:06:07 crc kubenswrapper[4892]: E0217 20:06:07.553954 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.767164 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" exitCode=0 Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.767258 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8"} Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.767532 4892 scope.go:117] "RemoveContainer" containerID="321964d924bc9b60d8e8d4a00f54b10a06cb73c81d8538ba205c584596e60b63" Feb 17 20:06:07 crc kubenswrapper[4892]: I0217 20:06:07.768591 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:06:07 crc kubenswrapper[4892]: E0217 20:06:07.769320 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:06:20 crc kubenswrapper[4892]: I0217 20:06:20.360111 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:06:20 crc kubenswrapper[4892]: E0217 20:06:20.361860 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:06:35 crc kubenswrapper[4892]: I0217 20:06:35.362268 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:06:35 crc kubenswrapper[4892]: E0217 20:06:35.365311 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:06:48 crc kubenswrapper[4892]: I0217 20:06:48.359544 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:06:48 crc kubenswrapper[4892]: E0217 20:06:48.360467 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:07:01 crc kubenswrapper[4892]: I0217 20:07:00.360151 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:07:01 crc kubenswrapper[4892]: E0217 20:07:00.361666 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:07:11 crc kubenswrapper[4892]: I0217 20:07:11.602951 4892 generic.go:334] "Generic (PLEG): container finished" podID="0dcceec2-d889-492f-921f-0da0329466c4" containerID="a56ab344ea4896a52b468632a1071f51a31438cb11a58b2f69cb0de5facedb52" exitCode=0 Feb 17 20:07:11 crc kubenswrapper[4892]: I0217 20:07:11.605022 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" event={"ID":"0dcceec2-d889-492f-921f-0da0329466c4","Type":"ContainerDied","Data":"a56ab344ea4896a52b468632a1071f51a31438cb11a58b2f69cb0de5facedb52"} Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.136440 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.219692 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-combined-ca-bundle\") pod \"0dcceec2-d889-492f-921f-0da0329466c4\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.219766 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ceph\") pod \"0dcceec2-d889-492f-921f-0da0329466c4\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.219796 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-agent-neutron-config-0\") pod \"0dcceec2-d889-492f-921f-0da0329466c4\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.219924 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-inventory\") pod \"0dcceec2-d889-492f-921f-0da0329466c4\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.220088 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ssh-key-openstack-cell1\") pod \"0dcceec2-d889-492f-921f-0da0329466c4\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.220241 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkxcf\" (UniqueName: \"kubernetes.io/projected/0dcceec2-d889-492f-921f-0da0329466c4-kube-api-access-bkxcf\") pod \"0dcceec2-d889-492f-921f-0da0329466c4\" (UID: \"0dcceec2-d889-492f-921f-0da0329466c4\") " Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.226369 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ceph" (OuterVolumeSpecName: "ceph") pod "0dcceec2-d889-492f-921f-0da0329466c4" (UID: "0dcceec2-d889-492f-921f-0da0329466c4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.227413 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcceec2-d889-492f-921f-0da0329466c4-kube-api-access-bkxcf" (OuterVolumeSpecName: "kube-api-access-bkxcf") pod "0dcceec2-d889-492f-921f-0da0329466c4" (UID: "0dcceec2-d889-492f-921f-0da0329466c4"). InnerVolumeSpecName "kube-api-access-bkxcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.227722 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "0dcceec2-d889-492f-921f-0da0329466c4" (UID: "0dcceec2-d889-492f-921f-0da0329466c4"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.263807 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-inventory" (OuterVolumeSpecName: "inventory") pod "0dcceec2-d889-492f-921f-0da0329466c4" (UID: "0dcceec2-d889-492f-921f-0da0329466c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.278587 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0dcceec2-d889-492f-921f-0da0329466c4" (UID: "0dcceec2-d889-492f-921f-0da0329466c4"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.288225 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "0dcceec2-d889-492f-921f-0da0329466c4" (UID: "0dcceec2-d889-492f-921f-0da0329466c4"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.323344 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.323373 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkxcf\" (UniqueName: \"kubernetes.io/projected/0dcceec2-d889-492f-921f-0da0329466c4-kube-api-access-bkxcf\") on node \"crc\" DevicePath \"\"" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.323383 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.323393 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.323406 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:07:13 crc 
kubenswrapper[4892]: I0217 20:07:13.323415 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dcceec2-d889-492f-921f-0da0329466c4-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.360669 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:07:13 crc kubenswrapper[4892]: E0217 20:07:13.361018 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.631423 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" event={"ID":"0dcceec2-d889-492f-921f-0da0329466c4","Type":"ContainerDied","Data":"1d99f73f633ff57b03a333741e41222b39445a820b08c309a1656600edb5b757"} Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.631486 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d99f73f633ff57b03a333741e41222b39445a820b08c309a1656600edb5b757" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.632257 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-v5r58" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.754804 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2"] Feb 17 20:07:13 crc kubenswrapper[4892]: E0217 20:07:13.755709 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcceec2-d889-492f-921f-0da0329466c4" containerName="neutron-sriov-openstack-openstack-cell1" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.755730 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcceec2-d889-492f-921f-0da0329466c4" containerName="neutron-sriov-openstack-openstack-cell1" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.756106 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcceec2-d889-492f-921f-0da0329466c4" containerName="neutron-sriov-openstack-openstack-cell1" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.757159 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.759373 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.763049 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.764208 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.764229 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.764290 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.770279 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2"] Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.938917 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4vn2\" (UniqueName: \"kubernetes.io/projected/e6cbf46e-f128-468f-99f5-64ae24e21ec6-kube-api-access-p4vn2\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.939138 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: 
\"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.939220 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.939303 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.939410 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:13 crc kubenswrapper[4892]: I0217 20:07:13.939570 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.043627 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-p4vn2\" (UniqueName: \"kubernetes.io/projected/e6cbf46e-f128-468f-99f5-64ae24e21ec6-kube-api-access-p4vn2\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.043708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.043730 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.043757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.043844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: 
\"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.044010 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.049343 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.050604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.051578 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.056505 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.062118 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.066861 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4vn2\" (UniqueName: \"kubernetes.io/projected/e6cbf46e-f128-468f-99f5-64ae24e21ec6-kube-api-access-p4vn2\") pod \"neutron-dhcp-openstack-openstack-cell1-qcxh2\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.075425 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:07:14 crc kubenswrapper[4892]: I0217 20:07:14.694721 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2"] Feb 17 20:07:15 crc kubenswrapper[4892]: I0217 20:07:15.663673 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" event={"ID":"e6cbf46e-f128-468f-99f5-64ae24e21ec6","Type":"ContainerStarted","Data":"13409eeb900c91982771ba8888757603515e069d6e086999cf964772e38d723e"} Feb 17 20:07:15 crc kubenswrapper[4892]: I0217 20:07:15.664134 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" event={"ID":"e6cbf46e-f128-468f-99f5-64ae24e21ec6","Type":"ContainerStarted","Data":"493f70f37c7633ab3f5abb54409ea923ba708e2bcff25d52946dc2c48d086b08"} Feb 17 20:07:15 crc kubenswrapper[4892]: I0217 20:07:15.706137 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" podStartSLOduration=2.298770298 podStartE2EDuration="2.706117697s" podCreationTimestamp="2026-02-17 20:07:13 +0000 UTC" firstStartedPulling="2026-02-17 20:07:14.700851153 +0000 UTC m=+8606.076254428" lastFinishedPulling="2026-02-17 20:07:15.108198562 +0000 UTC m=+8606.483601827" observedRunningTime="2026-02-17 20:07:15.685550304 +0000 UTC m=+8607.060953609" watchObservedRunningTime="2026-02-17 20:07:15.706117697 +0000 UTC m=+8607.081520972" Feb 17 20:07:27 crc kubenswrapper[4892]: I0217 20:07:27.360203 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:07:27 crc kubenswrapper[4892]: E0217 20:07:27.361159 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:07:38 crc kubenswrapper[4892]: I0217 20:07:38.360929 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:07:38 crc kubenswrapper[4892]: E0217 20:07:38.362552 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:07:51 crc kubenswrapper[4892]: I0217 20:07:51.360821 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:07:51 crc kubenswrapper[4892]: E0217 20:07:51.361782 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:08:06 crc kubenswrapper[4892]: I0217 20:08:06.360221 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:08:06 crc kubenswrapper[4892]: E0217 20:08:06.361112 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:08:19 crc kubenswrapper[4892]: I0217 20:08:19.366407 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:08:19 crc kubenswrapper[4892]: E0217 20:08:19.367319 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:08:29 crc kubenswrapper[4892]: I0217 20:08:29.637862 4892 generic.go:334] "Generic (PLEG): container finished" podID="e6cbf46e-f128-468f-99f5-64ae24e21ec6" containerID="13409eeb900c91982771ba8888757603515e069d6e086999cf964772e38d723e" exitCode=0 Feb 17 20:08:29 crc kubenswrapper[4892]: I0217 20:08:29.638444 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" event={"ID":"e6cbf46e-f128-468f-99f5-64ae24e21ec6","Type":"ContainerDied","Data":"13409eeb900c91982771ba8888757603515e069d6e086999cf964772e38d723e"} Feb 17 20:08:30 crc kubenswrapper[4892]: I0217 20:08:30.360373 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:08:30 crc kubenswrapper[4892]: E0217 20:08:30.360740 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.170263 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.219776 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-combined-ca-bundle\") pod \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.219914 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-inventory\") pod \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.219973 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4vn2\" (UniqueName: \"kubernetes.io/projected/e6cbf46e-f128-468f-99f5-64ae24e21ec6-kube-api-access-p4vn2\") pod \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.219987 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ceph\") pod \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.220061 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-agent-neutron-config-0\") pod \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.220181 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ssh-key-openstack-cell1\") pod \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\" (UID: \"e6cbf46e-f128-468f-99f5-64ae24e21ec6\") " Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.225744 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "e6cbf46e-f128-468f-99f5-64ae24e21ec6" (UID: "e6cbf46e-f128-468f-99f5-64ae24e21ec6"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.227149 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cbf46e-f128-468f-99f5-64ae24e21ec6-kube-api-access-p4vn2" (OuterVolumeSpecName: "kube-api-access-p4vn2") pod "e6cbf46e-f128-468f-99f5-64ae24e21ec6" (UID: "e6cbf46e-f128-468f-99f5-64ae24e21ec6"). InnerVolumeSpecName "kube-api-access-p4vn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.227162 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ceph" (OuterVolumeSpecName: "ceph") pod "e6cbf46e-f128-468f-99f5-64ae24e21ec6" (UID: "e6cbf46e-f128-468f-99f5-64ae24e21ec6"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.259290 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e6cbf46e-f128-468f-99f5-64ae24e21ec6" (UID: "e6cbf46e-f128-468f-99f5-64ae24e21ec6"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.269741 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-inventory" (OuterVolumeSpecName: "inventory") pod "e6cbf46e-f128-468f-99f5-64ae24e21ec6" (UID: "e6cbf46e-f128-468f-99f5-64ae24e21ec6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.294296 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "e6cbf46e-f128-468f-99f5-64ae24e21ec6" (UID: "e6cbf46e-f128-468f-99f5-64ae24e21ec6"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.323904 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.323950 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.323971 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.323990 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.324011 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4vn2\" (UniqueName: \"kubernetes.io/projected/e6cbf46e-f128-468f-99f5-64ae24e21ec6-kube-api-access-p4vn2\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.324027 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6cbf46e-f128-468f-99f5-64ae24e21ec6-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.665777 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" 
event={"ID":"e6cbf46e-f128-468f-99f5-64ae24e21ec6","Type":"ContainerDied","Data":"493f70f37c7633ab3f5abb54409ea923ba708e2bcff25d52946dc2c48d086b08"} Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.666087 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="493f70f37c7633ab3f5abb54409ea923ba708e2bcff25d52946dc2c48d086b08" Feb 17 20:08:31 crc kubenswrapper[4892]: I0217 20:08:31.665837 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qcxh2" Feb 17 20:08:44 crc kubenswrapper[4892]: I0217 20:08:44.359279 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:08:44 crc kubenswrapper[4892]: E0217 20:08:44.360226 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:08:55 crc kubenswrapper[4892]: I0217 20:08:55.359753 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:08:55 crc kubenswrapper[4892]: E0217 20:08:55.360575 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:08:56 crc kubenswrapper[4892]: I0217 20:08:56.425369 4892 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 20:08:56 crc kubenswrapper[4892]: I0217 20:08:56.425936 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" gracePeriod=30 Feb 17 20:08:56 crc kubenswrapper[4892]: I0217 20:08:56.455687 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 20:08:56 crc kubenswrapper[4892]: I0217 20:08:56.455927 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea" gracePeriod=30 Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.207663 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.207893 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-log" containerID="cri-o://87aac0b422a5c6fa015611fc43442b8018f3af9bd129b89100a498b163553bab" gracePeriod=30 Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.208001 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-api" containerID="cri-o://b9de8cff655d97851c673048b173338dbbdc3352968e715b840eb2394b079902" gracePeriod=30 Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.228206 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 20:08:57 crc 
kubenswrapper[4892]: I0217 20:08:57.228437 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" containerName="nova-scheduler-scheduler" containerID="cri-o://057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d" gracePeriod=30 Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.261046 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.261432 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-log" containerID="cri-o://9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b" gracePeriod=30 Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.261850 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-metadata" containerID="cri-o://b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e" gracePeriod=30 Feb 17 20:08:57 crc kubenswrapper[4892]: E0217 20:08:57.267180 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 20:08:57 crc kubenswrapper[4892]: E0217 20:08:57.271077 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 20:08:57 
crc kubenswrapper[4892]: E0217 20:08:57.275603 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 20:08:57 crc kubenswrapper[4892]: E0217 20:08:57.275680 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" containerName="nova-cell0-conductor-conductor" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.300055 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt"] Feb 17 20:08:57 crc kubenswrapper[4892]: E0217 20:08:57.300575 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cbf46e-f128-468f-99f5-64ae24e21ec6" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.300596 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cbf46e-f128-468f-99f5-64ae24e21ec6" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.300906 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cbf46e-f128-468f-99f5-64ae24e21ec6" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.301915 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.304837 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.305067 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.305216 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.305354 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.305478 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.305703 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-sdxx7" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.305907 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.323725 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt"] Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364316 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364453 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364529 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364592 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364661 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364695 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364729 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm724\" (UniqueName: \"kubernetes.io/projected/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-kube-api-access-qm724\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.364796 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.375155 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.375250 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.477398 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.478369 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.479049 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.479136 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: 
I0217 20:08:57.479371 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.479438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480229 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480251 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480319 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480365 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480392 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480408 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480443 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480473 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm724\" (UniqueName: \"kubernetes.io/projected/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-kube-api-access-qm724\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.480724 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.483171 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.483425 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-3\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.483579 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.491260 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.491479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.491579 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: 
\"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.491791 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.494744 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.499426 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.499789 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc 
kubenswrapper[4892]: I0217 20:08:57.504358 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm724\" (UniqueName: \"kubernetes.io/projected/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-kube-api-access-qm724\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:57 crc kubenswrapper[4892]: I0217 20:08:57.755220 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.053240 4892 generic.go:334] "Generic (PLEG): container finished" podID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerID="87aac0b422a5c6fa015611fc43442b8018f3af9bd129b89100a498b163553bab" exitCode=143 Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.053324 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74f65fe3-a32c-4468-aabd-7db13aa21aa4","Type":"ContainerDied","Data":"87aac0b422a5c6fa015611fc43442b8018f3af9bd129b89100a498b163553bab"} Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.055892 4892 generic.go:334] "Generic (PLEG): container finished" podID="d2730df2-b773-4353-9b44-9dcc2b516221" containerID="9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b" exitCode=143 Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.055996 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2730df2-b773-4353-9b44-9dcc2b516221","Type":"ContainerDied","Data":"9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b"} Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.411533 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt"] Feb 17 20:08:58 crc 
kubenswrapper[4892]: I0217 20:08:58.611672 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.706680 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8xc7\" (UniqueName: \"kubernetes.io/projected/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-kube-api-access-q8xc7\") pod \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.706723 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-combined-ca-bundle\") pod \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.706815 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-config-data\") pod \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\" (UID: \"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17\") " Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.794839 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-kube-api-access-q8xc7" (OuterVolumeSpecName: "kube-api-access-q8xc7") pod "e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" (UID: "e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17"). InnerVolumeSpecName "kube-api-access-q8xc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.807120 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-config-data" (OuterVolumeSpecName: "config-data") pod "e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" (UID: "e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.809115 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8xc7\" (UniqueName: \"kubernetes.io/projected/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-kube-api-access-q8xc7\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.809146 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.876780 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" (UID: "e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:08:58 crc kubenswrapper[4892]: I0217 20:08:58.911282 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.066308 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" event={"ID":"e13a74be-1b5c-4c8f-9c61-5aa0965a4610","Type":"ContainerStarted","Data":"59878c3afc46e1b7c947524cf872f2ab3c1a8ad6748a6c416dbc2ecde6e27529"} Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.067618 4892 generic.go:334] "Generic (PLEG): container finished" podID="e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" containerID="057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d" exitCode=0 Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.067675 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17","Type":"ContainerDied","Data":"057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d"} Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.067705 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17","Type":"ContainerDied","Data":"1029c03b3848928a3c761fe4bba5961501ebe32ad22655b1c3a5a353a9ddfc00"} Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.067722 4892 scope.go:117] "RemoveContainer" containerID="057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.067915 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.104318 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.109125 4892 scope.go:117] "RemoveContainer" containerID="057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d" Feb 17 20:08:59 crc kubenswrapper[4892]: E0217 20:08:59.109711 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d\": container with ID starting with 057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d not found: ID does not exist" containerID="057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.109751 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d"} err="failed to get container status \"057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d\": rpc error: code = NotFound desc = could not find container \"057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d\": container with ID starting with 057d700fe79b3768e3253ba7ecbf5d6eec1aeb3130b69d9e7917730c4b68974d not found: ID does not exist" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.114383 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.129390 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 20:08:59 crc kubenswrapper[4892]: E0217 20:08:59.129960 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" containerName="nova-scheduler-scheduler" Feb 17 20:08:59 crc 
kubenswrapper[4892]: I0217 20:08:59.129976 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" containerName="nova-scheduler-scheduler" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.130219 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" containerName="nova-scheduler-scheduler" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.131147 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.134941 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.157621 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.217919 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b16467-84db-4897-aaa7-6b523f95112f-config-data\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.218060 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b16467-84db-4897-aaa7-6b523f95112f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.218266 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w2cj\" (UniqueName: \"kubernetes.io/projected/88b16467-84db-4897-aaa7-6b523f95112f-kube-api-access-8w2cj\") pod \"nova-scheduler-0\" (UID: 
\"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.321064 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b16467-84db-4897-aaa7-6b523f95112f-config-data\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.321270 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b16467-84db-4897-aaa7-6b523f95112f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.321401 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w2cj\" (UniqueName: \"kubernetes.io/projected/88b16467-84db-4897-aaa7-6b523f95112f-kube-api-access-8w2cj\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.324841 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b16467-84db-4897-aaa7-6b523f95112f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.326088 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b16467-84db-4897-aaa7-6b523f95112f-config-data\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.347714 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8w2cj\" (UniqueName: \"kubernetes.io/projected/88b16467-84db-4897-aaa7-6b523f95112f-kube-api-access-8w2cj\") pod \"nova-scheduler-0\" (UID: \"88b16467-84db-4897-aaa7-6b523f95112f\") " pod="openstack/nova-scheduler-0" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.371339 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17" path="/var/lib/kubelet/pods/e2e0bc87-ae8f-4e54-9301-03b2ae1c1d17/volumes" Feb 17 20:08:59 crc kubenswrapper[4892]: I0217 20:08:59.506765 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 20:09:00 crc kubenswrapper[4892]: I0217 20:09:00.032282 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 20:09:00 crc kubenswrapper[4892]: I0217 20:09:00.083576 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"88b16467-84db-4897-aaa7-6b523f95112f","Type":"ContainerStarted","Data":"767bf9ca78de03a5c41ffc5473afda7c4858700ca4d6481902036d1cd682d26e"} Feb 17 20:09:00 crc kubenswrapper[4892]: I0217 20:09:00.085322 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" event={"ID":"e13a74be-1b5c-4c8f-9c61-5aa0965a4610","Type":"ContainerStarted","Data":"4ab13c964ab0206a0c133b840cd20bbfd2d901516632ade84ac120a831a22763"} Feb 17 20:09:00 crc kubenswrapper[4892]: I0217 20:09:00.122771 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" podStartSLOduration=2.669839261 podStartE2EDuration="3.122748876s" podCreationTimestamp="2026-02-17 20:08:57 +0000 UTC" firstStartedPulling="2026-02-17 20:08:58.415807133 +0000 UTC m=+8709.791210398" lastFinishedPulling="2026-02-17 20:08:58.868716748 +0000 UTC m=+8710.244120013" 
observedRunningTime="2026-02-17 20:09:00.107339771 +0000 UTC m=+8711.482743026" watchObservedRunningTime="2026-02-17 20:09:00.122748876 +0000 UTC m=+8711.498152141" Feb 17 20:09:00 crc kubenswrapper[4892]: E0217 20:09:00.253157 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 20:09:00 crc kubenswrapper[4892]: E0217 20:09:00.254955 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 20:09:00 crc kubenswrapper[4892]: E0217 20:09:00.256447 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 17 20:09:00 crc kubenswrapper[4892]: E0217 20:09:00.256488 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" containerName="nova-cell1-conductor-conductor" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.006569 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.066216 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gnn2\" (UniqueName: \"kubernetes.io/projected/d2730df2-b773-4353-9b44-9dcc2b516221-kube-api-access-6gnn2\") pod \"d2730df2-b773-4353-9b44-9dcc2b516221\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.066480 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2730df2-b773-4353-9b44-9dcc2b516221-logs\") pod \"d2730df2-b773-4353-9b44-9dcc2b516221\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.066517 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-combined-ca-bundle\") pod \"d2730df2-b773-4353-9b44-9dcc2b516221\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.066595 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-config-data\") pod \"d2730df2-b773-4353-9b44-9dcc2b516221\" (UID: \"d2730df2-b773-4353-9b44-9dcc2b516221\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.067431 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2730df2-b773-4353-9b44-9dcc2b516221-logs" (OuterVolumeSpecName: "logs") pod "d2730df2-b773-4353-9b44-9dcc2b516221" (UID: "d2730df2-b773-4353-9b44-9dcc2b516221"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.079065 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2730df2-b773-4353-9b44-9dcc2b516221-kube-api-access-6gnn2" (OuterVolumeSpecName: "kube-api-access-6gnn2") pod "d2730df2-b773-4353-9b44-9dcc2b516221" (UID: "d2730df2-b773-4353-9b44-9dcc2b516221"). InnerVolumeSpecName "kube-api-access-6gnn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.111057 4892 generic.go:334] "Generic (PLEG): container finished" podID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerID="b9de8cff655d97851c673048b173338dbbdc3352968e715b840eb2394b079902" exitCode=0 Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.111142 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74f65fe3-a32c-4468-aabd-7db13aa21aa4","Type":"ContainerDied","Data":"b9de8cff655d97851c673048b173338dbbdc3352968e715b840eb2394b079902"} Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.133928 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2730df2-b773-4353-9b44-9dcc2b516221" (UID: "d2730df2-b773-4353-9b44-9dcc2b516221"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.149456 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"88b16467-84db-4897-aaa7-6b523f95112f","Type":"ContainerStarted","Data":"d207015fa1d61ca65e95d5730553e9e4cd6354ba8061f8545b6dd74b6e8c6d3d"} Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.154098 4892 generic.go:334] "Generic (PLEG): container finished" podID="d2730df2-b773-4353-9b44-9dcc2b516221" containerID="b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e" exitCode=0 Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.154466 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2730df2-b773-4353-9b44-9dcc2b516221","Type":"ContainerDied","Data":"b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e"} Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.154498 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2730df2-b773-4353-9b44-9dcc2b516221","Type":"ContainerDied","Data":"3e37c35f3af4419064c8cf5cf8169daf9e829e6f37f4d15cb535e72fc5732ef7"} Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.154516 4892 scope.go:117] "RemoveContainer" containerID="b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.154642 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.155991 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-config-data" (OuterVolumeSpecName: "config-data") pod "d2730df2-b773-4353-9b44-9dcc2b516221" (UID: "d2730df2-b773-4353-9b44-9dcc2b516221"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.164697 4892 generic.go:334] "Generic (PLEG): container finished" podID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" containerID="07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea" exitCode=0 Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.164720 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a","Type":"ContainerDied","Data":"07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea"} Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.169396 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2730df2-b773-4353-9b44-9dcc2b516221-logs\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.170625 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.171610 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2730df2-b773-4353-9b44-9dcc2b516221-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.171629 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gnn2\" (UniqueName: \"kubernetes.io/projected/d2730df2-b773-4353-9b44-9dcc2b516221-kube-api-access-6gnn2\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.204157 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.204134999 podStartE2EDuration="2.204134999s" podCreationTimestamp="2026-02-17 20:08:59 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:09:01.183279997 +0000 UTC m=+8712.558683272" watchObservedRunningTime="2026-02-17 20:09:01.204134999 +0000 UTC m=+8712.579538264" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.255413 4892 scope.go:117] "RemoveContainer" containerID="9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.256507 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.299511 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.323995 4892 scope.go:117] "RemoveContainer" containerID="b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e" Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.327202 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e\": container with ID starting with b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e not found: ID does not exist" containerID="b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.327233 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e"} err="failed to get container status \"b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e\": rpc error: code = NotFound desc = could not find container \"b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e\": container with ID starting with b1834fba28e56bcb58a0012a5daa7923a9b18bebea4c07285fa112db4ff2513e not found: ID does 
not exist" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.327255 4892 scope.go:117] "RemoveContainer" containerID="9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b" Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.329122 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b\": container with ID starting with 9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b not found: ID does not exist" containerID="9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.329152 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b"} err="failed to get container status \"9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b\": rpc error: code = NotFound desc = could not find container \"9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b\": container with ID starting with 9e63277c45c08d2e1cd6457d1e1997d3242a058acfd4f5be20156014f47c439b not found: ID does not exist" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.382543 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f65fe3-a32c-4468-aabd-7db13aa21aa4-logs\") pod \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.382626 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-config-data\") pod \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.383154 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbrvl\" (UniqueName: \"kubernetes.io/projected/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-kube-api-access-mbrvl\") pod \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.383490 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-config-data\") pod \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.383534 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-combined-ca-bundle\") pod \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\" (UID: \"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.384269 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8ng\" (UniqueName: \"kubernetes.io/projected/74f65fe3-a32c-4468-aabd-7db13aa21aa4-kube-api-access-cx8ng\") pod \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.384307 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-combined-ca-bundle\") pod \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\" (UID: \"74f65fe3-a32c-4468-aabd-7db13aa21aa4\") " Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.389013 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74f65fe3-a32c-4468-aabd-7db13aa21aa4-logs" (OuterVolumeSpecName: "logs") pod 
"74f65fe3-a32c-4468-aabd-7db13aa21aa4" (UID: "74f65fe3-a32c-4468-aabd-7db13aa21aa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.391433 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f65fe3-a32c-4468-aabd-7db13aa21aa4-kube-api-access-cx8ng" (OuterVolumeSpecName: "kube-api-access-cx8ng") pod "74f65fe3-a32c-4468-aabd-7db13aa21aa4" (UID: "74f65fe3-a32c-4468-aabd-7db13aa21aa4"). InnerVolumeSpecName "kube-api-access-cx8ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.400428 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-kube-api-access-mbrvl" (OuterVolumeSpecName: "kube-api-access-mbrvl") pod "74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" (UID: "74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a"). InnerVolumeSpecName "kube-api-access-mbrvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.461034 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" (UID: "74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.495860 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbrvl\" (UniqueName: \"kubernetes.io/projected/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-kube-api-access-mbrvl\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.497329 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.497345 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx8ng\" (UniqueName: \"kubernetes.io/projected/74f65fe3-a32c-4468-aabd-7db13aa21aa4-kube-api-access-cx8ng\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.497548 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f65fe3-a32c-4468-aabd-7db13aa21aa4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.525078 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-config-data" (OuterVolumeSpecName: "config-data") pod "74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" (UID: "74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.542938 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74f65fe3-a32c-4468-aabd-7db13aa21aa4" (UID: "74f65fe3-a32c-4468-aabd-7db13aa21aa4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.555981 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-config-data" (OuterVolumeSpecName: "config-data") pod "74f65fe3-a32c-4468-aabd-7db13aa21aa4" (UID: "74f65fe3-a32c-4468-aabd-7db13aa21aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.600811 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.600923 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.600936 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65fe3-a32c-4468-aabd-7db13aa21aa4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.693655 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.705920 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.717410 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.718102 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-metadata" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718125 
4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-metadata" Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.718140 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-api" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718148 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-api" Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.718172 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-log" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718208 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-log" Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.718229 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-log" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718236 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-log" Feb 17 20:09:01 crc kubenswrapper[4892]: E0217 20:09:01.718248 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" containerName="nova-cell1-conductor-conductor" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718256 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" containerName="nova-cell1-conductor-conductor" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718548 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-log" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 
20:09:01.718587 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-log" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718602 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" containerName="nova-cell1-conductor-conductor" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718610 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" containerName="nova-api-api" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.718621 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" containerName="nova-metadata-metadata" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.720056 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.731936 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.754218 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.804805 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.804908 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-config-data\") pod \"nova-metadata-0\" (UID: 
\"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.804937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhzm\" (UniqueName: \"kubernetes.io/projected/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-kube-api-access-chhzm\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.804971 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-logs\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.906384 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.906466 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-config-data\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.906497 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chhzm\" (UniqueName: \"kubernetes.io/projected/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-kube-api-access-chhzm\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.906535 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-logs\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.906993 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-logs\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.918940 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-config-data\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.919741 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:01 crc kubenswrapper[4892]: I0217 20:09:01.928129 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhzm\" (UniqueName: \"kubernetes.io/projected/5e2cefca-37d0-4868-97cf-bf6b1a24f0a3-kube-api-access-chhzm\") pod \"nova-metadata-0\" (UID: \"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3\") " pod="openstack/nova-metadata-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.003358 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.076070 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.110433 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-config-data\") pod \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.110731 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-combined-ca-bundle\") pod \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.110777 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tssfw\" (UniqueName: \"kubernetes.io/projected/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-kube-api-access-tssfw\") pod \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\" (UID: \"d62a3b6f-05ed-4c4f-8c4a-357db18967d9\") " Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.117078 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-kube-api-access-tssfw" (OuterVolumeSpecName: "kube-api-access-tssfw") pod "d62a3b6f-05ed-4c4f-8c4a-357db18967d9" (UID: "d62a3b6f-05ed-4c4f-8c4a-357db18967d9"). InnerVolumeSpecName "kube-api-access-tssfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.144782 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d62a3b6f-05ed-4c4f-8c4a-357db18967d9" (UID: "d62a3b6f-05ed-4c4f-8c4a-357db18967d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.168202 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-config-data" (OuterVolumeSpecName: "config-data") pod "d62a3b6f-05ed-4c4f-8c4a-357db18967d9" (UID: "d62a3b6f-05ed-4c4f-8c4a-357db18967d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.177050 4892 generic.go:334] "Generic (PLEG): container finished" podID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" exitCode=0 Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.177133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d62a3b6f-05ed-4c4f-8c4a-357db18967d9","Type":"ContainerDied","Data":"e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa"} Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.177161 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d62a3b6f-05ed-4c4f-8c4a-357db18967d9","Type":"ContainerDied","Data":"5b5b4dc24ee48a92e732682d1c20d60b7befb52e0cb00b8e9d5677cc01d93b2c"} Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.177177 4892 scope.go:117] "RemoveContainer" containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.177281 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.180108 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a","Type":"ContainerDied","Data":"b61d938e01773abc1c5cc198308cf4b262f97b00eb5606512a6b2e5079fef25c"} Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.180131 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.203219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74f65fe3-a32c-4468-aabd-7db13aa21aa4","Type":"ContainerDied","Data":"83c872092d163fa24e137a4766ea36cb3df34dc0f787c63f5fb1cd59ab346371"} Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.203334 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.213730 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.213759 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.213777 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tssfw\" (UniqueName: \"kubernetes.io/projected/d62a3b6f-05ed-4c4f-8c4a-357db18967d9-kube-api-access-tssfw\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.250123 4892 scope.go:117] "RemoveContainer" 
containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" Feb 17 20:09:02 crc kubenswrapper[4892]: E0217 20:09:02.251545 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa\": container with ID starting with e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa not found: ID does not exist" containerID="e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.251588 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa"} err="failed to get container status \"e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa\": rpc error: code = NotFound desc = could not find container \"e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa\": container with ID starting with e8700e9d019703f8522b742152a7b7f0198429932f23ad3980fe61f339da91aa not found: ID does not exist" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.251605 4892 scope.go:117] "RemoveContainer" containerID="07783f23535025be8b8331c575a55ec1f402a4c838604e9179cb34c1bf66d3ea" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.282899 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.333311 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.334761 4892 scope.go:117] "RemoveContainer" containerID="b9de8cff655d97851c673048b173338dbbdc3352968e715b840eb2394b079902" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.356662 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: E0217 20:09:02.357666 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" containerName="nova-cell0-conductor-conductor" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.357691 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" containerName="nova-cell0-conductor-conductor" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.357994 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" containerName="nova-cell0-conductor-conductor" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.359535 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.363329 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.376227 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.399266 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.415318 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.417323 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872897e3-475e-40c0-b7f3-ae6a8c6efd29-logs\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.417404 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872897e3-475e-40c0-b7f3-ae6a8c6efd29-config-data\") pod 
\"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.417497 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872897e3-475e-40c0-b7f3-ae6a8c6efd29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.417529 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmflb\" (UniqueName: \"kubernetes.io/projected/872897e3-475e-40c0-b7f3-ae6a8c6efd29-kube-api-access-wmflb\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.442063 4892 scope.go:117] "RemoveContainer" containerID="87aac0b422a5c6fa015611fc43442b8018f3af9bd129b89100a498b163553bab" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.450897 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.452442 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.456211 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.467443 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.477982 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.489017 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.500624 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.502363 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.507107 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.522348 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872897e3-475e-40c0-b7f3-ae6a8c6efd29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.522601 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxsg\" (UniqueName: \"kubernetes.io/projected/5bbf722a-acfd-4b39-b080-b9022968adac-kube-api-access-pxxsg\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 
20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.522768 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmflb\" (UniqueName: \"kubernetes.io/projected/872897e3-475e-40c0-b7f3-ae6a8c6efd29-kube-api-access-wmflb\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.522941 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbf722a-acfd-4b39-b080-b9022968adac-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.523655 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872897e3-475e-40c0-b7f3-ae6a8c6efd29-logs\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.523755 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbf722a-acfd-4b39-b080-b9022968adac-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.523982 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872897e3-475e-40c0-b7f3-ae6a8c6efd29-config-data\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.525516 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/872897e3-475e-40c0-b7f3-ae6a8c6efd29-logs\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.536959 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.543575 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmflb\" (UniqueName: \"kubernetes.io/projected/872897e3-475e-40c0-b7f3-ae6a8c6efd29-kube-api-access-wmflb\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.549208 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872897e3-475e-40c0-b7f3-ae6a8c6efd29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.559946 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872897e3-475e-40c0-b7f3-ae6a8c6efd29-config-data\") pod \"nova-api-0\" (UID: \"872897e3-475e-40c0-b7f3-ae6a8c6efd29\") " pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.629252 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bae35-ce99-4aed-915a-d8e1c5d8202c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.629323 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxsg\" (UniqueName: 
\"kubernetes.io/projected/5bbf722a-acfd-4b39-b080-b9022968adac-kube-api-access-pxxsg\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.629386 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbf722a-acfd-4b39-b080-b9022968adac-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.629436 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbf722a-acfd-4b39-b080-b9022968adac-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.629475 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bae35-ce99-4aed-915a-d8e1c5d8202c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.629502 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flk8k\" (UniqueName: \"kubernetes.io/projected/313bae35-ce99-4aed-915a-d8e1c5d8202c-kube-api-access-flk8k\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.635613 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bbf722a-acfd-4b39-b080-b9022968adac-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.636509 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbf722a-acfd-4b39-b080-b9022968adac-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.657654 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxsg\" (UniqueName: \"kubernetes.io/projected/5bbf722a-acfd-4b39-b080-b9022968adac-kube-api-access-pxxsg\") pod \"nova-cell0-conductor-0\" (UID: \"5bbf722a-acfd-4b39-b080-b9022968adac\") " pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.729705 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.732493 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bae35-ce99-4aed-915a-d8e1c5d8202c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.732565 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flk8k\" (UniqueName: \"kubernetes.io/projected/313bae35-ce99-4aed-915a-d8e1c5d8202c-kube-api-access-flk8k\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.732701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bae35-ce99-4aed-915a-d8e1c5d8202c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.741438 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bae35-ce99-4aed-915a-d8e1c5d8202c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.741647 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bae35-ce99-4aed-915a-d8e1c5d8202c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.785663 4892 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.802536 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flk8k\" (UniqueName: \"kubernetes.io/projected/313bae35-ce99-4aed-915a-d8e1c5d8202c-kube-api-access-flk8k\") pod \"nova-cell1-conductor-0\" (UID: \"313bae35-ce99-4aed-915a-d8e1c5d8202c\") " pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.810087 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 20:09:02 crc kubenswrapper[4892]: I0217 20:09:02.825127 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:02 crc kubenswrapper[4892]: W0217 20:09:02.975956 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2cefca_37d0_4868_97cf_bf6b1a24f0a3.slice/crio-7c75e0b0dcb1856b40b7e5e130a0126793746357aad46e049fe9372577c0862d WatchSource:0}: Error finding container 7c75e0b0dcb1856b40b7e5e130a0126793746357aad46e049fe9372577c0862d: Status 404 returned error can't find the container with id 7c75e0b0dcb1856b40b7e5e130a0126793746357aad46e049fe9372577c0862d Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.247075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3","Type":"ContainerStarted","Data":"7c75e0b0dcb1856b40b7e5e130a0126793746357aad46e049fe9372577c0862d"} Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.377878 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a" path="/var/lib/kubelet/pods/74eb0508-84ab-4ac0-a7b4-5cdc321c3c4a/volumes" Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.378880 4892 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f65fe3-a32c-4468-aabd-7db13aa21aa4" path="/var/lib/kubelet/pods/74f65fe3-a32c-4468-aabd-7db13aa21aa4/volumes" Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.380470 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2730df2-b773-4353-9b44-9dcc2b516221" path="/var/lib/kubelet/pods/d2730df2-b773-4353-9b44-9dcc2b516221/volumes" Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.381630 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62a3b6f-05ed-4c4f-8c4a-357db18967d9" path="/var/lib/kubelet/pods/d62a3b6f-05ed-4c4f-8c4a-357db18967d9/volumes" Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.454240 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.603862 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 20:09:03 crc kubenswrapper[4892]: W0217 20:09:03.638564 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313bae35_ce99_4aed_915a_d8e1c5d8202c.slice/crio-73d04cf94f0f1ecf79ce9c9e76cf65239233fde59551eb0f975131759a6cba35 WatchSource:0}: Error finding container 73d04cf94f0f1ecf79ce9c9e76cf65239233fde59551eb0f975131759a6cba35: Status 404 returned error can't find the container with id 73d04cf94f0f1ecf79ce9c9e76cf65239233fde59551eb0f975131759a6cba35 Feb 17 20:09:03 crc kubenswrapper[4892]: I0217 20:09:03.639680 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.275984 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"313bae35-ce99-4aed-915a-d8e1c5d8202c","Type":"ContainerStarted","Data":"f3a8d41ab3290a05b4f13bbcb27843da8aa33156d1ccf0d462016209d32c4c37"} Feb 
17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.276233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"313bae35-ce99-4aed-915a-d8e1c5d8202c","Type":"ContainerStarted","Data":"73d04cf94f0f1ecf79ce9c9e76cf65239233fde59551eb0f975131759a6cba35"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.277530 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.289019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5bbf722a-acfd-4b39-b080-b9022968adac","Type":"ContainerStarted","Data":"5ff7c69e3b9cf3092a1e359268271b08a42dbebedbd8cfc37ed7d2ffef8fb78c"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.289071 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5bbf722a-acfd-4b39-b080-b9022968adac","Type":"ContainerStarted","Data":"8b6f71223a8805d82a584da9f0b7a360e914d5bf5adcfaffafc2b3b3cd300c28"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.290240 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.303163 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.303145703 podStartE2EDuration="2.303145703s" podCreationTimestamp="2026-02-17 20:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:09:04.29522092 +0000 UTC m=+8715.670624185" watchObservedRunningTime="2026-02-17 20:09:04.303145703 +0000 UTC m=+8715.678548968" Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.303935 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3","Type":"ContainerStarted","Data":"c2e9ee802051006c728c32f13392639471e82d755aae595aef1c01efd4008a34"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.303976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e2cefca-37d0-4868-97cf-bf6b1a24f0a3","Type":"ContainerStarted","Data":"3da4c7bdfa30fca8641bc3f9eae9fafa6a86209c90c06c5bb9bcdcfee76cf74e"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.317113 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"872897e3-475e-40c0-b7f3-ae6a8c6efd29","Type":"ContainerStarted","Data":"65db8e772c9fdecef76b66598fbe1a0d6048d0defe3d6aeeab640a81015c9602"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.317173 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"872897e3-475e-40c0-b7f3-ae6a8c6efd29","Type":"ContainerStarted","Data":"94a456e0860c758d5222477150e10f94afb73ca2e80e722e2ebacb1c8154610c"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.317189 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"872897e3-475e-40c0-b7f3-ae6a8c6efd29","Type":"ContainerStarted","Data":"4c9031c4cacb6f8f35d67fb34fd9048e09533097c2a66fea52d84039e5a5d283"} Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.322354 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.3223344 podStartE2EDuration="2.3223344s" podCreationTimestamp="2026-02-17 20:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:09:04.312280539 +0000 UTC m=+8715.687683804" watchObservedRunningTime="2026-02-17 20:09:04.3223344 +0000 UTC m=+8715.697737665" Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.352113 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.35209044 podStartE2EDuration="3.35209044s" podCreationTimestamp="2026-02-17 20:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:09:04.340287272 +0000 UTC m=+8715.715690527" watchObservedRunningTime="2026-02-17 20:09:04.35209044 +0000 UTC m=+8715.727493705" Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.368653 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.368631615 podStartE2EDuration="2.368631615s" podCreationTimestamp="2026-02-17 20:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:09:04.356471218 +0000 UTC m=+8715.731874483" watchObservedRunningTime="2026-02-17 20:09:04.368631615 +0000 UTC m=+8715.744034880" Feb 17 20:09:04 crc kubenswrapper[4892]: I0217 20:09:04.507925 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 20:09:07 crc kubenswrapper[4892]: I0217 20:09:07.076763 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 20:09:07 crc kubenswrapper[4892]: I0217 20:09:07.077324 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 20:09:08 crc kubenswrapper[4892]: I0217 20:09:08.360127 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:09:08 crc kubenswrapper[4892]: E0217 20:09:08.360879 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:09:09 crc kubenswrapper[4892]: I0217 20:09:09.507620 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 20:09:09 crc kubenswrapper[4892]: I0217 20:09:09.569248 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 20:09:10 crc kubenswrapper[4892]: I0217 20:09:10.443118 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 20:09:12 crc kubenswrapper[4892]: I0217 20:09:12.076629 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 20:09:12 crc kubenswrapper[4892]: I0217 20:09:12.076975 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 20:09:12 crc kubenswrapper[4892]: I0217 20:09:12.731192 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 20:09:12 crc kubenswrapper[4892]: I0217 20:09:12.731288 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 20:09:12 crc kubenswrapper[4892]: I0217 20:09:12.862449 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 20:09:12 crc kubenswrapper[4892]: I0217 20:09:12.871447 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 20:09:13 crc kubenswrapper[4892]: I0217 20:09:13.158993 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e2cefca-37d0-4868-97cf-bf6b1a24f0a3" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"http://10.217.1.219:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 20:09:13 crc kubenswrapper[4892]: I0217 20:09:13.159008 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e2cefca-37d0-4868-97cf-bf6b1a24f0a3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.219:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 20:09:13 crc kubenswrapper[4892]: I0217 20:09:13.813052 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="872897e3-475e-40c0-b7f3-ae6a8c6efd29" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 20:09:13 crc kubenswrapper[4892]: I0217 20:09:13.813057 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="872897e3-475e-40c0-b7f3-ae6a8c6efd29" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 20:09:20 crc kubenswrapper[4892]: I0217 20:09:20.360231 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:09:20 crc kubenswrapper[4892]: E0217 20:09:20.361180 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.079461 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.081875 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.083386 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.585349 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.737191 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.737719 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.738839 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 20:09:22 crc kubenswrapper[4892]: I0217 20:09:22.744943 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 20:09:23 crc kubenswrapper[4892]: I0217 20:09:23.599168 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 20:09:23 crc kubenswrapper[4892]: I0217 20:09:23.604358 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.198201 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4f5j9"] Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.201987 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.231365 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4f5j9"] Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.360405 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:09:32 crc kubenswrapper[4892]: E0217 20:09:32.360679 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.375091 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c624r\" (UniqueName: \"kubernetes.io/projected/2f46f74c-ce94-4ff5-8543-6580352fda57-kube-api-access-c624r\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.375140 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f46f74c-ce94-4ff5-8543-6580352fda57-utilities\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.375317 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2f46f74c-ce94-4ff5-8543-6580352fda57-catalog-content\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.477457 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c624r\" (UniqueName: \"kubernetes.io/projected/2f46f74c-ce94-4ff5-8543-6580352fda57-kube-api-access-c624r\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.477507 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f46f74c-ce94-4ff5-8543-6580352fda57-utilities\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.477712 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f46f74c-ce94-4ff5-8543-6580352fda57-catalog-content\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.479705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f46f74c-ce94-4ff5-8543-6580352fda57-catalog-content\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.479759 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2f46f74c-ce94-4ff5-8543-6580352fda57-utilities\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.499406 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c624r\" (UniqueName: \"kubernetes.io/projected/2f46f74c-ce94-4ff5-8543-6580352fda57-kube-api-access-c624r\") pod \"certified-operators-4f5j9\" (UID: \"2f46f74c-ce94-4ff5-8543-6580352fda57\") " pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:32 crc kubenswrapper[4892]: I0217 20:09:32.533333 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:33 crc kubenswrapper[4892]: I0217 20:09:33.175204 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4f5j9"] Feb 17 20:09:33 crc kubenswrapper[4892]: I0217 20:09:33.741132 4892 generic.go:334] "Generic (PLEG): container finished" podID="2f46f74c-ce94-4ff5-8543-6580352fda57" containerID="f89c01e27aa878a9dff90518e49e874242f5335071052b9ac808c372dbb21593" exitCode=0 Feb 17 20:09:33 crc kubenswrapper[4892]: I0217 20:09:33.741458 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4f5j9" event={"ID":"2f46f74c-ce94-4ff5-8543-6580352fda57","Type":"ContainerDied","Data":"f89c01e27aa878a9dff90518e49e874242f5335071052b9ac808c372dbb21593"} Feb 17 20:09:33 crc kubenswrapper[4892]: I0217 20:09:33.741498 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4f5j9" event={"ID":"2f46f74c-ce94-4ff5-8543-6580352fda57","Type":"ContainerStarted","Data":"f885ab9a2503384d4b3e908f198f98d30526a0b90fea9e2dfeef9eec3d6fa624"} Feb 17 20:09:40 crc kubenswrapper[4892]: I0217 20:09:40.832562 4892 generic.go:334] "Generic (PLEG): container 
finished" podID="2f46f74c-ce94-4ff5-8543-6580352fda57" containerID="ac9867c91dc974186696f18f7dfb6bc763870de2223aa46a16edf94632c1bec1" exitCode=0 Feb 17 20:09:40 crc kubenswrapper[4892]: I0217 20:09:40.833101 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4f5j9" event={"ID":"2f46f74c-ce94-4ff5-8543-6580352fda57","Type":"ContainerDied","Data":"ac9867c91dc974186696f18f7dfb6bc763870de2223aa46a16edf94632c1bec1"} Feb 17 20:09:41 crc kubenswrapper[4892]: I0217 20:09:41.847763 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4f5j9" event={"ID":"2f46f74c-ce94-4ff5-8543-6580352fda57","Type":"ContainerStarted","Data":"0574484f243b526a5ac652118d0143ca517213fff0b9f04842ebf331f43063d6"} Feb 17 20:09:41 crc kubenswrapper[4892]: I0217 20:09:41.876519 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4f5j9" podStartSLOduration=2.37203139 podStartE2EDuration="9.876493447s" podCreationTimestamp="2026-02-17 20:09:32 +0000 UTC" firstStartedPulling="2026-02-17 20:09:33.7442359 +0000 UTC m=+8745.119639165" lastFinishedPulling="2026-02-17 20:09:41.248697947 +0000 UTC m=+8752.624101222" observedRunningTime="2026-02-17 20:09:41.868605794 +0000 UTC m=+8753.244009059" watchObservedRunningTime="2026-02-17 20:09:41.876493447 +0000 UTC m=+8753.251896752" Feb 17 20:09:42 crc kubenswrapper[4892]: I0217 20:09:42.534056 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:42 crc kubenswrapper[4892]: I0217 20:09:42.534119 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:43 crc kubenswrapper[4892]: I0217 20:09:43.598976 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4f5j9" 
podUID="2f46f74c-ce94-4ff5-8543-6580352fda57" containerName="registry-server" probeResult="failure" output=< Feb 17 20:09:43 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 20:09:43 crc kubenswrapper[4892]: > Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.360295 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:09:46 crc kubenswrapper[4892]: E0217 20:09:46.361172 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.607108 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nznkb"] Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.610358 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.623716 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nznkb"] Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.738844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fm8x\" (UniqueName: \"kubernetes.io/projected/330bbc2b-6580-4c85-8180-847df82d681f-kube-api-access-7fm8x\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.738964 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-catalog-content\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.739095 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-utilities\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.842188 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fm8x\" (UniqueName: \"kubernetes.io/projected/330bbc2b-6580-4c85-8180-847df82d681f-kube-api-access-7fm8x\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.842414 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-catalog-content\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.842680 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-utilities\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.843166 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-catalog-content\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.843560 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-utilities\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.867712 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fm8x\" (UniqueName: \"kubernetes.io/projected/330bbc2b-6580-4c85-8180-847df82d681f-kube-api-access-7fm8x\") pod \"redhat-marketplace-nznkb\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:46 crc kubenswrapper[4892]: I0217 20:09:46.944073 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:47 crc kubenswrapper[4892]: I0217 20:09:47.452717 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nznkb"] Feb 17 20:09:47 crc kubenswrapper[4892]: I0217 20:09:47.927723 4892 generic.go:334] "Generic (PLEG): container finished" podID="330bbc2b-6580-4c85-8180-847df82d681f" containerID="7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424" exitCode=0 Feb 17 20:09:47 crc kubenswrapper[4892]: I0217 20:09:47.927806 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerDied","Data":"7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424"} Feb 17 20:09:47 crc kubenswrapper[4892]: I0217 20:09:47.928097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerStarted","Data":"1a931eb4f7df10e4905ab9e0c213408c9a9537d2ab94ca8654b6703d1b52c325"} Feb 17 20:09:48 crc kubenswrapper[4892]: I0217 20:09:48.956024 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerStarted","Data":"8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35"} Feb 17 20:09:49 crc kubenswrapper[4892]: I0217 20:09:49.971987 4892 generic.go:334] "Generic (PLEG): container finished" podID="330bbc2b-6580-4c85-8180-847df82d681f" containerID="8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35" exitCode=0 Feb 17 20:09:49 crc kubenswrapper[4892]: I0217 20:09:49.972043 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" 
event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerDied","Data":"8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35"} Feb 17 20:09:50 crc kubenswrapper[4892]: I0217 20:09:50.984491 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerStarted","Data":"a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec"} Feb 17 20:09:51 crc kubenswrapper[4892]: I0217 20:09:51.006504 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nznkb" podStartSLOduration=2.568511084 podStartE2EDuration="5.006486275s" podCreationTimestamp="2026-02-17 20:09:46 +0000 UTC" firstStartedPulling="2026-02-17 20:09:47.93046715 +0000 UTC m=+8759.305870415" lastFinishedPulling="2026-02-17 20:09:50.368442341 +0000 UTC m=+8761.743845606" observedRunningTime="2026-02-17 20:09:50.999681733 +0000 UTC m=+8762.375085008" watchObservedRunningTime="2026-02-17 20:09:51.006486275 +0000 UTC m=+8762.381889540" Feb 17 20:09:52 crc kubenswrapper[4892]: I0217 20:09:52.581132 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:52 crc kubenswrapper[4892]: I0217 20:09:52.645270 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4f5j9" Feb 17 20:09:56 crc kubenswrapper[4892]: I0217 20:09:56.235249 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4f5j9"] Feb 17 20:09:56 crc kubenswrapper[4892]: I0217 20:09:56.795435 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5cd89"] Feb 17 20:09:56 crc kubenswrapper[4892]: I0217 20:09:56.795958 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-5cd89" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="registry-server" containerID="cri-o://60ede4528881a0e75c0a773cdc6e4742ed2805584239afe4bcecf7be8adcba0a" gracePeriod=2 Feb 17 20:09:56 crc kubenswrapper[4892]: I0217 20:09:56.944641 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:56 crc kubenswrapper[4892]: I0217 20:09:56.944746 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.003215 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.085239 4892 generic.go:334] "Generic (PLEG): container finished" podID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerID="60ede4528881a0e75c0a773cdc6e4742ed2805584239afe4bcecf7be8adcba0a" exitCode=0 Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.085853 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerDied","Data":"60ede4528881a0e75c0a773cdc6e4742ed2805584239afe4bcecf7be8adcba0a"} Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.142038 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.387880 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hw9fh"] Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.388165 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hw9fh" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" 
containerName="registry-server" containerID="cri-o://cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be" gracePeriod=2 Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.406107 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cd89" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.509880 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47pnh\" (UniqueName: \"kubernetes.io/projected/94cfcd99-3052-49c6-991c-571a85bdeba5-kube-api-access-47pnh\") pod \"94cfcd99-3052-49c6-991c-571a85bdeba5\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.509940 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-catalog-content\") pod \"94cfcd99-3052-49c6-991c-571a85bdeba5\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.510078 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-utilities\") pod \"94cfcd99-3052-49c6-991c-571a85bdeba5\" (UID: \"94cfcd99-3052-49c6-991c-571a85bdeba5\") " Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.510685 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-utilities" (OuterVolumeSpecName: "utilities") pod "94cfcd99-3052-49c6-991c-571a85bdeba5" (UID: "94cfcd99-3052-49c6-991c-571a85bdeba5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.515897 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cfcd99-3052-49c6-991c-571a85bdeba5-kube-api-access-47pnh" (OuterVolumeSpecName: "kube-api-access-47pnh") pod "94cfcd99-3052-49c6-991c-571a85bdeba5" (UID: "94cfcd99-3052-49c6-991c-571a85bdeba5"). InnerVolumeSpecName "kube-api-access-47pnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.535615 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94cfcd99-3052-49c6-991c-571a85bdeba5" (UID: "94cfcd99-3052-49c6-991c-571a85bdeba5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.613714 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.614368 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47pnh\" (UniqueName: \"kubernetes.io/projected/94cfcd99-3052-49c6-991c-571a85bdeba5-kube-api-access-47pnh\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.614386 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfcd99-3052-49c6-991c-571a85bdeba5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:57 crc kubenswrapper[4892]: I0217 20:09:57.859036 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.023967 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-catalog-content\") pod \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.024059 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7p5g\" (UniqueName: \"kubernetes.io/projected/8d6875b4-db50-44e7-b270-ceb7bc6ef804-kube-api-access-s7p5g\") pod \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.024245 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-utilities\") pod \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\" (UID: \"8d6875b4-db50-44e7-b270-ceb7bc6ef804\") " Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.025862 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-utilities" (OuterVolumeSpecName: "utilities") pod "8d6875b4-db50-44e7-b270-ceb7bc6ef804" (UID: "8d6875b4-db50-44e7-b270-ceb7bc6ef804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.030108 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6875b4-db50-44e7-b270-ceb7bc6ef804-kube-api-access-s7p5g" (OuterVolumeSpecName: "kube-api-access-s7p5g") pod "8d6875b4-db50-44e7-b270-ceb7bc6ef804" (UID: "8d6875b4-db50-44e7-b270-ceb7bc6ef804"). InnerVolumeSpecName "kube-api-access-s7p5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.060873 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d6875b4-db50-44e7-b270-ceb7bc6ef804" (UID: "8d6875b4-db50-44e7-b270-ceb7bc6ef804"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.097294 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cd89" event={"ID":"94cfcd99-3052-49c6-991c-571a85bdeba5","Type":"ContainerDied","Data":"78c3dbdb10ebc9c0da599587dbf59ebe7bd05ff8e72754b8eaead5e05c5bd1ba"} Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.097343 4892 scope.go:117] "RemoveContainer" containerID="60ede4528881a0e75c0a773cdc6e4742ed2805584239afe4bcecf7be8adcba0a" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.097456 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5cd89" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.107160 4892 generic.go:334] "Generic (PLEG): container finished" podID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerID="cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be" exitCode=0 Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.107196 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerDied","Data":"cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be"} Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.107223 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hw9fh" event={"ID":"8d6875b4-db50-44e7-b270-ceb7bc6ef804","Type":"ContainerDied","Data":"e8bc16e1a1a2023ac06d03ae26e05cf2902fd7a793b6c30d314ae6e3b73fabf4"} Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.107272 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hw9fh" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.127695 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.127739 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6875b4-db50-44e7-b270-ceb7bc6ef804-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.127754 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7p5g\" (UniqueName: \"kubernetes.io/projected/8d6875b4-db50-44e7-b270-ceb7bc6ef804-kube-api-access-s7p5g\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.152061 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5cd89"] Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.158097 4892 scope.go:117] "RemoveContainer" containerID="4c7c4f01e276d66570dc871177174486b174db591f011f3bcfaf345727957eca" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.163381 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5cd89"] Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.173262 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hw9fh"] Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.186129 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hw9fh"] Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.199264 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx7cq"] Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.199518 
4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wx7cq" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="registry-server" containerID="cri-o://96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778" gracePeriod=2 Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.199888 4892 scope.go:117] "RemoveContainer" containerID="f89a549aee8b2cfcccc4c26aa6cd254abec5f29488ed0084d00d0b7760e7fb32" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.252298 4892 scope.go:117] "RemoveContainer" containerID="cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.305443 4892 scope.go:117] "RemoveContainer" containerID="2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.432441 4892 scope.go:117] "RemoveContainer" containerID="f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.467906 4892 scope.go:117] "RemoveContainer" containerID="cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be" Feb 17 20:09:58 crc kubenswrapper[4892]: E0217 20:09:58.468461 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be\": container with ID starting with cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be not found: ID does not exist" containerID="cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.468513 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be"} err="failed to get container status 
\"cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be\": rpc error: code = NotFound desc = could not find container \"cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be\": container with ID starting with cbf0f4bb67090449de60937e4c5f126db77b22d75a30b29fa70a2de42b9432be not found: ID does not exist" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.468549 4892 scope.go:117] "RemoveContainer" containerID="2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8" Feb 17 20:09:58 crc kubenswrapper[4892]: E0217 20:09:58.468897 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8\": container with ID starting with 2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8 not found: ID does not exist" containerID="2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.468938 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8"} err="failed to get container status \"2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8\": rpc error: code = NotFound desc = could not find container \"2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8\": container with ID starting with 2da9e6d989f7b60732cff5aa1691028e7269fbd5413a1f63af6a225f0c0b83a8 not found: ID does not exist" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.468962 4892 scope.go:117] "RemoveContainer" containerID="f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc" Feb 17 20:09:58 crc kubenswrapper[4892]: E0217 20:09:58.469248 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc\": container with ID starting with f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc not found: ID does not exist" containerID="f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.469288 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc"} err="failed to get container status \"f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc\": rpc error: code = NotFound desc = could not find container \"f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc\": container with ID starting with f2819bfa5e2d545e8b28aa1b846a66666d5981d51498cac306c94a5d7f4a99bc not found: ID does not exist" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.640398 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.740733 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-utilities\") pod \"f9918aee-e88f-48dc-a1dd-f5de340b5570\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.740798 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g5gg\" (UniqueName: \"kubernetes.io/projected/f9918aee-e88f-48dc-a1dd-f5de340b5570-kube-api-access-4g5gg\") pod \"f9918aee-e88f-48dc-a1dd-f5de340b5570\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.741040 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-catalog-content\") pod \"f9918aee-e88f-48dc-a1dd-f5de340b5570\" (UID: \"f9918aee-e88f-48dc-a1dd-f5de340b5570\") " Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.742278 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-utilities" (OuterVolumeSpecName: "utilities") pod "f9918aee-e88f-48dc-a1dd-f5de340b5570" (UID: "f9918aee-e88f-48dc-a1dd-f5de340b5570"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.746041 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9918aee-e88f-48dc-a1dd-f5de340b5570-kube-api-access-4g5gg" (OuterVolumeSpecName: "kube-api-access-4g5gg") pod "f9918aee-e88f-48dc-a1dd-f5de340b5570" (UID: "f9918aee-e88f-48dc-a1dd-f5de340b5570"). InnerVolumeSpecName "kube-api-access-4g5gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.820089 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9918aee-e88f-48dc-a1dd-f5de340b5570" (UID: "f9918aee-e88f-48dc-a1dd-f5de340b5570"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.843986 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.844018 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9918aee-e88f-48dc-a1dd-f5de340b5570-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.844031 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g5gg\" (UniqueName: \"kubernetes.io/projected/f9918aee-e88f-48dc-a1dd-f5de340b5570-kube-api-access-4g5gg\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.991502 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hjlcl"] Feb 17 20:09:58 crc kubenswrapper[4892]: I0217 20:09:58.991802 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hjlcl" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="registry-server" containerID="cri-o://ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4" gracePeriod=2 Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.123376 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerID="96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778" exitCode=0 Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.123451 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx7cq" event={"ID":"f9918aee-e88f-48dc-a1dd-f5de340b5570","Type":"ContainerDied","Data":"96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778"} Feb 17 
20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.123512 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wx7cq" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.123541 4892 scope.go:117] "RemoveContainer" containerID="96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.123529 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx7cq" event={"ID":"f9918aee-e88f-48dc-a1dd-f5de340b5570","Type":"ContainerDied","Data":"d163a85c08372f5fa77859458f92754a6b606ca3578b57e7ad7791325ee15e4d"} Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.179390 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx7cq"] Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.192133 4892 scope.go:117] "RemoveContainer" containerID="5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.195645 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wx7cq"] Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.217034 4892 scope.go:117] "RemoveContainer" containerID="b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.266337 4892 scope.go:117] "RemoveContainer" containerID="96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778" Feb 17 20:09:59 crc kubenswrapper[4892]: E0217 20:09:59.266774 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778\": container with ID starting with 96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778 not found: ID does not exist" 
containerID="96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.266837 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778"} err="failed to get container status \"96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778\": rpc error: code = NotFound desc = could not find container \"96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778\": container with ID starting with 96a79f94b5b068c3072a49a54e121f86581f7a8054e048bd036cd18e252fa778 not found: ID does not exist" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.266870 4892 scope.go:117] "RemoveContainer" containerID="5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286" Feb 17 20:09:59 crc kubenswrapper[4892]: E0217 20:09:59.267368 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286\": container with ID starting with 5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286 not found: ID does not exist" containerID="5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.267413 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286"} err="failed to get container status \"5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286\": rpc error: code = NotFound desc = could not find container \"5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286\": container with ID starting with 5a0564725f144f21c2482dc8d633ec3655af50ec683bf0c6cfda35ab831c6286 not found: ID does not exist" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.267440 4892 scope.go:117] 
"RemoveContainer" containerID="b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37" Feb 17 20:09:59 crc kubenswrapper[4892]: E0217 20:09:59.267726 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37\": container with ID starting with b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37 not found: ID does not exist" containerID="b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.267761 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37"} err="failed to get container status \"b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37\": rpc error: code = NotFound desc = could not find container \"b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37\": container with ID starting with b17ee6f58577586801aedd782b3b3c56090c0334aa71ce8b9f80f905ed97da37 not found: ID does not exist" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.383840 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:09:59 crc kubenswrapper[4892]: E0217 20:09:59.384352 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.385395 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" path="/var/lib/kubelet/pods/8d6875b4-db50-44e7-b270-ceb7bc6ef804/volumes" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.386376 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" path="/var/lib/kubelet/pods/94cfcd99-3052-49c6-991c-571a85bdeba5/volumes" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.387214 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" path="/var/lib/kubelet/pods/f9918aee-e88f-48dc-a1dd-f5de340b5570/volumes" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.507137 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.668495 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s65h8\" (UniqueName: \"kubernetes.io/projected/e6edd657-e697-42e2-bfc4-3ea98348eb23-kube-api-access-s65h8\") pod \"e6edd657-e697-42e2-bfc4-3ea98348eb23\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.668560 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-catalog-content\") pod \"e6edd657-e697-42e2-bfc4-3ea98348eb23\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.668898 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-utilities\") pod \"e6edd657-e697-42e2-bfc4-3ea98348eb23\" (UID: \"e6edd657-e697-42e2-bfc4-3ea98348eb23\") " Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.669230 4892 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-utilities" (OuterVolumeSpecName: "utilities") pod "e6edd657-e697-42e2-bfc4-3ea98348eb23" (UID: "e6edd657-e697-42e2-bfc4-3ea98348eb23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.669975 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.808528 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jnfwl"] Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.810884 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jnfwl" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="registry-server" containerID="cri-o://af4a15d0b4f9f93d9b089cf26fd05394e4d9ac66db54ab531c48a2158c0c0841" gracePeriod=2 Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.831423 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6edd657-e697-42e2-bfc4-3ea98348eb23-kube-api-access-s65h8" (OuterVolumeSpecName: "kube-api-access-s65h8") pod "e6edd657-e697-42e2-bfc4-3ea98348eb23" (UID: "e6edd657-e697-42e2-bfc4-3ea98348eb23"). InnerVolumeSpecName "kube-api-access-s65h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.875763 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s65h8\" (UniqueName: \"kubernetes.io/projected/e6edd657-e697-42e2-bfc4-3ea98348eb23-kube-api-access-s65h8\") on node \"crc\" DevicePath \"\"" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.880212 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6edd657-e697-42e2-bfc4-3ea98348eb23" (UID: "e6edd657-e697-42e2-bfc4-3ea98348eb23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:09:59 crc kubenswrapper[4892]: I0217 20:09:59.978137 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6edd657-e697-42e2-bfc4-3ea98348eb23-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.142000 4892 generic.go:334] "Generic (PLEG): container finished" podID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerID="ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4" exitCode=0 Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.142076 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerDied","Data":"ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4"} Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.142108 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjlcl" event={"ID":"e6edd657-e697-42e2-bfc4-3ea98348eb23","Type":"ContainerDied","Data":"4150b5ddbb9802419e9cfbdfa8318ab322335734077bc15885a044299956782f"} Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 
20:10:00.142129 4892 scope.go:117] "RemoveContainer" containerID="ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.142297 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjlcl" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.175983 4892 generic.go:334] "Generic (PLEG): container finished" podID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerID="af4a15d0b4f9f93d9b089cf26fd05394e4d9ac66db54ab531c48a2158c0c0841" exitCode=0 Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.176034 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerDied","Data":"af4a15d0b4f9f93d9b089cf26fd05394e4d9ac66db54ab531c48a2158c0c0841"} Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.187962 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hjlcl"] Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.191394 4892 scope.go:117] "RemoveContainer" containerID="b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.199130 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hjlcl"] Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.216884 4892 scope.go:117] "RemoveContainer" containerID="76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.314842 4892 scope.go:117] "RemoveContainer" containerID="ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4" Feb 17 20:10:00 crc kubenswrapper[4892]: E0217 20:10:00.315405 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4\": container with ID starting with ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4 not found: ID does not exist" containerID="ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.315458 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4"} err="failed to get container status \"ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4\": rpc error: code = NotFound desc = could not find container \"ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4\": container with ID starting with ff98d1f4fe8d39315ef9d68f94466af7e580b1f041b4be80f27f79aa7b851ce4 not found: ID does not exist" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.315488 4892 scope.go:117] "RemoveContainer" containerID="b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df" Feb 17 20:10:00 crc kubenswrapper[4892]: E0217 20:10:00.316035 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df\": container with ID starting with b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df not found: ID does not exist" containerID="b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.316063 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df"} err="failed to get container status \"b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df\": rpc error: code = NotFound desc = could not find container \"b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df\": container with ID 
starting with b71360393e5093d54ee4280b406404bd3aeba92aa26135a2c1a7471c904481df not found: ID does not exist" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.316080 4892 scope.go:117] "RemoveContainer" containerID="76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376" Feb 17 20:10:00 crc kubenswrapper[4892]: E0217 20:10:00.316321 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376\": container with ID starting with 76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376 not found: ID does not exist" containerID="76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.316342 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376"} err="failed to get container status \"76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376\": rpc error: code = NotFound desc = could not find container \"76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376\": container with ID starting with 76c56ce35efa20ed85f745f01843987bf7d2c433748e37590881bd8f4e22b376 not found: ID does not exist" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.350421 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.493950 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqw7\" (UniqueName: \"kubernetes.io/projected/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-kube-api-access-8vqw7\") pod \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.494059 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-catalog-content\") pod \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.494176 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-utilities\") pod \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\" (UID: \"d42f2f0e-3e80-4231-9daa-5abdd0a0091e\") " Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.494766 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-utilities" (OuterVolumeSpecName: "utilities") pod "d42f2f0e-3e80-4231-9daa-5abdd0a0091e" (UID: "d42f2f0e-3e80-4231-9daa-5abdd0a0091e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.497351 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.499624 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-kube-api-access-8vqw7" (OuterVolumeSpecName: "kube-api-access-8vqw7") pod "d42f2f0e-3e80-4231-9daa-5abdd0a0091e" (UID: "d42f2f0e-3e80-4231-9daa-5abdd0a0091e"). InnerVolumeSpecName "kube-api-access-8vqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.518343 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d42f2f0e-3e80-4231-9daa-5abdd0a0091e" (UID: "d42f2f0e-3e80-4231-9daa-5abdd0a0091e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.596627 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmmjd"] Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.597019 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rmmjd" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="registry-server" containerID="cri-o://c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672" gracePeriod=2 Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.600639 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqw7\" (UniqueName: \"kubernetes.io/projected/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-kube-api-access-8vqw7\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:00 crc kubenswrapper[4892]: I0217 20:10:00.600683 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d42f2f0e-3e80-4231-9daa-5abdd0a0091e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.187508 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljjvr"] Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.188217 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ljjvr" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="registry-server" containerID="cri-o://1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1" gracePeriod=2 Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.188388 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.188537 4892 generic.go:334] "Generic (PLEG): container finished" podID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerID="c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672" exitCode=0 Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.188590 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerDied","Data":"c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672"} Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.188614 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmmjd" event={"ID":"12793f08-a0ab-411a-9449-b0b2d0834e5e","Type":"ContainerDied","Data":"705ce30159b8d544970c18e0e288bcd207bbfa658193fe13e118d29b76e471c3"} Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.188633 4892 scope.go:117] "RemoveContainer" containerID="c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.193975 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnfwl" event={"ID":"d42f2f0e-3e80-4231-9daa-5abdd0a0091e","Type":"ContainerDied","Data":"a45d5fbdd1d074685bdd6d039464c9dcfbb02b3895b1b830ae2673c1dec44789"} Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.194385 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jnfwl" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.229268 4892 scope.go:117] "RemoveContainer" containerID="5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.256991 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jnfwl"] Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.273038 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jnfwl"] Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.306074 4892 scope.go:117] "RemoveContainer" containerID="b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.325245 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8qqz\" (UniqueName: \"kubernetes.io/projected/12793f08-a0ab-411a-9449-b0b2d0834e5e-kube-api-access-j8qqz\") pod \"12793f08-a0ab-411a-9449-b0b2d0834e5e\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.325312 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-utilities\") pod \"12793f08-a0ab-411a-9449-b0b2d0834e5e\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.325391 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-catalog-content\") pod \"12793f08-a0ab-411a-9449-b0b2d0834e5e\" (UID: \"12793f08-a0ab-411a-9449-b0b2d0834e5e\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.328186 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-utilities" (OuterVolumeSpecName: "utilities") pod "12793f08-a0ab-411a-9449-b0b2d0834e5e" (UID: "12793f08-a0ab-411a-9449-b0b2d0834e5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.332888 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12793f08-a0ab-411a-9449-b0b2d0834e5e-kube-api-access-j8qqz" (OuterVolumeSpecName: "kube-api-access-j8qqz") pod "12793f08-a0ab-411a-9449-b0b2d0834e5e" (UID: "12793f08-a0ab-411a-9449-b0b2d0834e5e"). InnerVolumeSpecName "kube-api-access-j8qqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.348672 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12793f08-a0ab-411a-9449-b0b2d0834e5e" (UID: "12793f08-a0ab-411a-9449-b0b2d0834e5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.376005 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" path="/var/lib/kubelet/pods/d42f2f0e-3e80-4231-9daa-5abdd0a0091e/volumes" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.376666 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" path="/var/lib/kubelet/pods/e6edd657-e697-42e2-bfc4-3ea98348eb23/volumes" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.395268 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nznkb"] Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.395510 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nznkb" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="registry-server" containerID="cri-o://a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec" gracePeriod=2 Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.409392 4892 scope.go:117] "RemoveContainer" containerID="c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672" Feb 17 20:10:01 crc kubenswrapper[4892]: E0217 20:10:01.410136 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672\": container with ID starting with c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672 not found: ID does not exist" containerID="c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.410197 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672"} err="failed to get 
container status \"c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672\": rpc error: code = NotFound desc = could not find container \"c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672\": container with ID starting with c6e50c67a1a5ad2f80bf4049b2522399dfcf51701c4703092c83ad868afd9672 not found: ID does not exist" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.410271 4892 scope.go:117] "RemoveContainer" containerID="5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa" Feb 17 20:10:01 crc kubenswrapper[4892]: E0217 20:10:01.410639 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa\": container with ID starting with 5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa not found: ID does not exist" containerID="5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.410673 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa"} err="failed to get container status \"5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa\": rpc error: code = NotFound desc = could not find container \"5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa\": container with ID starting with 5a4f985587cbecaa347e64c8c095b0203f618e96b04bc5548ce1287ccf6e91fa not found: ID does not exist" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.410698 4892 scope.go:117] "RemoveContainer" containerID="b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64" Feb 17 20:10:01 crc kubenswrapper[4892]: E0217 20:10:01.410974 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64\": container with ID starting with b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64 not found: ID does not exist" containerID="b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.411001 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64"} err="failed to get container status \"b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64\": rpc error: code = NotFound desc = could not find container \"b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64\": container with ID starting with b4cba89435d139bae2a0ba80644ac183aa902671fdb2340208afda10f9d98f64 not found: ID does not exist" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.411019 4892 scope.go:117] "RemoveContainer" containerID="af4a15d0b4f9f93d9b089cf26fd05394e4d9ac66db54ab531c48a2158c0c0841" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.430948 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8qqz\" (UniqueName: \"kubernetes.io/projected/12793f08-a0ab-411a-9449-b0b2d0834e5e-kube-api-access-j8qqz\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.431110 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.431191 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12793f08-a0ab-411a-9449-b0b2d0834e5e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.452450 4892 scope.go:117] "RemoveContainer" 
containerID="6254cae7da380d7ce1441e1e4c8df8573f650443ff6cb9e79bd49fa402b06d1d" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.525159 4892 scope.go:117] "RemoveContainer" containerID="5ff92214d6f6318966ebda783a4c31c55dbc09aa86f2bafab13284df0b0b7df6" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.626941 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.739371 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf2rt\" (UniqueName: \"kubernetes.io/projected/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-kube-api-access-gf2rt\") pod \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.739500 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-catalog-content\") pod \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.739612 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-utilities\") pod \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\" (UID: \"4b23b6a9-794e-40ce-93b0-8b60ec25cf20\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.740357 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-utilities" (OuterVolumeSpecName: "utilities") pod "4b23b6a9-794e-40ce-93b0-8b60ec25cf20" (UID: "4b23b6a9-794e-40ce-93b0-8b60ec25cf20"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.748716 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-kube-api-access-gf2rt" (OuterVolumeSpecName: "kube-api-access-gf2rt") pod "4b23b6a9-794e-40ce-93b0-8b60ec25cf20" (UID: "4b23b6a9-794e-40ce-93b0-8b60ec25cf20"). InnerVolumeSpecName "kube-api-access-gf2rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.775157 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b23b6a9-794e-40ce-93b0-8b60ec25cf20" (UID: "4b23b6a9-794e-40ce-93b0-8b60ec25cf20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.788518 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnd6s"] Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.788778 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xnd6s" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="registry-server" containerID="cri-o://1c64ab752bc5e66cc7a7098a411531a3daf533cd829bad312c8be8492189604a" gracePeriod=2 Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.798430 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.842205 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.842226 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.842236 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf2rt\" (UniqueName: \"kubernetes.io/projected/4b23b6a9-794e-40ce-93b0-8b60ec25cf20-kube-api-access-gf2rt\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.943929 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-utilities\") pod \"330bbc2b-6580-4c85-8180-847df82d681f\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.944165 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fm8x\" (UniqueName: \"kubernetes.io/projected/330bbc2b-6580-4c85-8180-847df82d681f-kube-api-access-7fm8x\") pod \"330bbc2b-6580-4c85-8180-847df82d681f\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.944485 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-catalog-content\") pod \"330bbc2b-6580-4c85-8180-847df82d681f\" (UID: \"330bbc2b-6580-4c85-8180-847df82d681f\") " Feb 17 20:10:01 crc kubenswrapper[4892]: 
I0217 20:10:01.945254 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-utilities" (OuterVolumeSpecName: "utilities") pod "330bbc2b-6580-4c85-8180-847df82d681f" (UID: "330bbc2b-6580-4c85-8180-847df82d681f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.947800 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330bbc2b-6580-4c85-8180-847df82d681f-kube-api-access-7fm8x" (OuterVolumeSpecName: "kube-api-access-7fm8x") pod "330bbc2b-6580-4c85-8180-847df82d681f" (UID: "330bbc2b-6580-4c85-8180-847df82d681f"). InnerVolumeSpecName "kube-api-access-7fm8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:01 crc kubenswrapper[4892]: I0217 20:10:01.969380 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "330bbc2b-6580-4c85-8180-847df82d681f" (UID: "330bbc2b-6580-4c85-8180-847df82d681f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.047444 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.047475 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330bbc2b-6580-4c85-8180-847df82d681f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.047489 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fm8x\" (UniqueName: \"kubernetes.io/projected/330bbc2b-6580-4c85-8180-847df82d681f-kube-api-access-7fm8x\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.223448 4892 generic.go:334] "Generic (PLEG): container finished" podID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerID="1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1" exitCode=0 Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.223611 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerDied","Data":"1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1"} Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.223778 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljjvr" event={"ID":"4b23b6a9-794e-40ce-93b0-8b60ec25cf20","Type":"ContainerDied","Data":"d08591d644054f20e71ce9acb2784ca82e53a24408d7527da05e59e6615dd2d3"} Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.223700 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ljjvr" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.223802 4892 scope.go:117] "RemoveContainer" containerID="1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.226703 4892 generic.go:334] "Generic (PLEG): container finished" podID="330bbc2b-6580-4c85-8180-847df82d681f" containerID="a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec" exitCode=0 Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.226760 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nznkb" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.226771 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerDied","Data":"a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec"} Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.226797 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nznkb" event={"ID":"330bbc2b-6580-4c85-8180-847df82d681f","Type":"ContainerDied","Data":"1a931eb4f7df10e4905ab9e0c213408c9a9537d2ab94ca8654b6703d1b52c325"} Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.229261 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmmjd" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.247443 4892 generic.go:334] "Generic (PLEG): container finished" podID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerID="1c64ab752bc5e66cc7a7098a411531a3daf533cd829bad312c8be8492189604a" exitCode=0 Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.247503 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerDied","Data":"1c64ab752bc5e66cc7a7098a411531a3daf533cd829bad312c8be8492189604a"} Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.267835 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmmjd"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.282591 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rmmjd"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.360514 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.387579 4892 scope.go:117] "RemoveContainer" containerID="f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.391227 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nznkb"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.403171 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5c7bz"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.403733 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5c7bz" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="registry-server" containerID="cri-o://707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2" gracePeriod=2 Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.420323 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nznkb"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.434934 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljjvr"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.440863 4892 scope.go:117] "RemoveContainer" containerID="c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.446559 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ljjvr"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.465923 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn9bb\" (UniqueName: \"kubernetes.io/projected/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-kube-api-access-pn9bb\") pod \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\" (UID: 
\"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.466043 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-utilities\") pod \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.466126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-catalog-content\") pod \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\" (UID: \"e446b496-04a9-4d11-b207-cdfe5ccc6d5a\") " Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.470131 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-utilities" (OuterVolumeSpecName: "utilities") pod "e446b496-04a9-4d11-b207-cdfe5ccc6d5a" (UID: "e446b496-04a9-4d11-b207-cdfe5ccc6d5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.474361 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-kube-api-access-pn9bb" (OuterVolumeSpecName: "kube-api-access-pn9bb") pod "e446b496-04a9-4d11-b207-cdfe5ccc6d5a" (UID: "e446b496-04a9-4d11-b207-cdfe5ccc6d5a"). InnerVolumeSpecName "kube-api-access-pn9bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.527125 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e446b496-04a9-4d11-b207-cdfe5ccc6d5a" (UID: "e446b496-04a9-4d11-b207-cdfe5ccc6d5a"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.532411 4892 scope.go:117] "RemoveContainer" containerID="1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1" Feb 17 20:10:02 crc kubenswrapper[4892]: E0217 20:10:02.532900 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1\": container with ID starting with 1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1 not found: ID does not exist" containerID="1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.532992 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1"} err="failed to get container status \"1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1\": rpc error: code = NotFound desc = could not find container \"1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1\": container with ID starting with 1de36c79e76db97c5e8cdaba0096c633c3bce0fff11e04340520c991b31f96b1 not found: ID does not exist" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.533080 4892 scope.go:117] "RemoveContainer" containerID="f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636" Feb 17 20:10:02 crc kubenswrapper[4892]: E0217 20:10:02.533506 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636\": container with ID starting with f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636 not found: ID does not exist" containerID="f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636" Feb 17 20:10:02 crc 
kubenswrapper[4892]: I0217 20:10:02.533540 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636"} err="failed to get container status \"f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636\": rpc error: code = NotFound desc = could not find container \"f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636\": container with ID starting with f7266bc344c6aab906c236cba119a7a4ac288fa4a38b69bd19c3a169fbdc3636 not found: ID does not exist" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.533559 4892 scope.go:117] "RemoveContainer" containerID="c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44" Feb 17 20:10:02 crc kubenswrapper[4892]: E0217 20:10:02.533732 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44\": container with ID starting with c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44 not found: ID does not exist" containerID="c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.533917 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44"} err="failed to get container status \"c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44\": rpc error: code = NotFound desc = could not find container \"c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44\": container with ID starting with c8437b48b739e1ba68ee8987a855e7ea7ae6e38da4ab45d51fb3da9dba4e1e44 not found: ID does not exist" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.534001 4892 scope.go:117] "RemoveContainer" containerID="a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec" Feb 17 
20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.559406 4892 scope.go:117] "RemoveContainer" containerID="8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.580093 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn9bb\" (UniqueName: \"kubernetes.io/projected/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-kube-api-access-pn9bb\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.580122 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.580132 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e446b496-04a9-4d11-b207-cdfe5ccc6d5a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.581212 4892 scope.go:117] "RemoveContainer" containerID="7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.602966 4892 scope.go:117] "RemoveContainer" containerID="a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec" Feb 17 20:10:02 crc kubenswrapper[4892]: E0217 20:10:02.603866 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec\": container with ID starting with a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec not found: ID does not exist" containerID="a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.604007 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec"} err="failed to get container status \"a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec\": rpc error: code = NotFound desc = could not find container \"a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec\": container with ID starting with a52f033cccbf6f2542aa4251af58d5d8015114a735f0572f09a5deba23f63cec not found: ID does not exist" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.604035 4892 scope.go:117] "RemoveContainer" containerID="8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35" Feb 17 20:10:02 crc kubenswrapper[4892]: E0217 20:10:02.604875 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35\": container with ID starting with 8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35 not found: ID does not exist" containerID="8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.604942 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35"} err="failed to get container status \"8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35\": rpc error: code = NotFound desc = could not find container \"8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35\": container with ID starting with 8a2a8b03b290867ca8100ed61686e61288e0f0b47209005e966806849a8f8b35 not found: ID does not exist" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.604968 4892 scope.go:117] "RemoveContainer" containerID="7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424" Feb 17 20:10:02 crc kubenswrapper[4892]: E0217 20:10:02.606003 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424\": container with ID starting with 7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424 not found: ID does not exist" containerID="7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.606036 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424"} err="failed to get container status \"7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424\": rpc error: code = NotFound desc = could not find container \"7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424\": container with ID starting with 7e042ea6c93c3ba548cef648dd64ea03a562f5f55d68664e299e60dd8eecb424 not found: ID does not exist" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.867351 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.982966 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bv8xp"] Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.983202 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bv8xp" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="registry-server" containerID="cri-o://803ffa13fee64053edce941066bb95ac5f892a4716a2b03b73a3fa0c42dae1ee" gracePeriod=2 Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.988288 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-catalog-content\") pod \"30a36169-14a5-4d2a-9f66-fb343852b6a6\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.988712 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5b92\" (UniqueName: \"kubernetes.io/projected/30a36169-14a5-4d2a-9f66-fb343852b6a6-kube-api-access-d5b92\") pod \"30a36169-14a5-4d2a-9f66-fb343852b6a6\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.988842 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-utilities\") pod \"30a36169-14a5-4d2a-9f66-fb343852b6a6\" (UID: \"30a36169-14a5-4d2a-9f66-fb343852b6a6\") " Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.989991 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-utilities" (OuterVolumeSpecName: "utilities") pod "30a36169-14a5-4d2a-9f66-fb343852b6a6" (UID: 
"30a36169-14a5-4d2a-9f66-fb343852b6a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:02 crc kubenswrapper[4892]: I0217 20:10:02.995885 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a36169-14a5-4d2a-9f66-fb343852b6a6-kube-api-access-d5b92" (OuterVolumeSpecName: "kube-api-access-d5b92") pod "30a36169-14a5-4d2a-9f66-fb343852b6a6" (UID: "30a36169-14a5-4d2a-9f66-fb343852b6a6"). InnerVolumeSpecName "kube-api-access-d5b92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.018621 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30a36169-14a5-4d2a-9f66-fb343852b6a6" (UID: "30a36169-14a5-4d2a-9f66-fb343852b6a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.092283 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5b92\" (UniqueName: \"kubernetes.io/projected/30a36169-14a5-4d2a-9f66-fb343852b6a6-kube-api-access-d5b92\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.092337 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.092349 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a36169-14a5-4d2a-9f66-fb343852b6a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.262907 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerID="707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2" exitCode=0 Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.262986 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerDied","Data":"707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2"} Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.263016 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c7bz" event={"ID":"30a36169-14a5-4d2a-9f66-fb343852b6a6","Type":"ContainerDied","Data":"1d9ae68afc2ff801f2b2b86fad2316504f9a41898364b972be58f60088ba7b3c"} Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.263019 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5c7bz" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.263037 4892 scope.go:117] "RemoveContainer" containerID="707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.268100 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnd6s" event={"ID":"e446b496-04a9-4d11-b207-cdfe5ccc6d5a","Type":"ContainerDied","Data":"ba10995aab86678b399e02de5daed1c0326197987867c6d1f5fe7d3b1253ba25"} Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.268125 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnd6s" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.288115 4892 scope.go:117] "RemoveContainer" containerID="76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.324450 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5c7bz"] Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.326744 4892 scope.go:117] "RemoveContainer" containerID="caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.338938 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5c7bz"] Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.351908 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnd6s"] Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.376811 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" path="/var/lib/kubelet/pods/12793f08-a0ab-411a-9449-b0b2d0834e5e/volumes" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.377448 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" path="/var/lib/kubelet/pods/30a36169-14a5-4d2a-9f66-fb343852b6a6/volumes" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.378082 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330bbc2b-6580-4c85-8180-847df82d681f" path="/var/lib/kubelet/pods/330bbc2b-6580-4c85-8180-847df82d681f/volumes" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.379297 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" path="/var/lib/kubelet/pods/4b23b6a9-794e-40ce-93b0-8b60ec25cf20/volumes" Feb 17 20:10:03 crc kubenswrapper[4892]: 
I0217 20:10:03.380096 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xnd6s"] Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.401230 4892 scope.go:117] "RemoveContainer" containerID="707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2" Feb 17 20:10:03 crc kubenswrapper[4892]: E0217 20:10:03.401606 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2\": container with ID starting with 707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2 not found: ID does not exist" containerID="707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.401632 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2"} err="failed to get container status \"707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2\": rpc error: code = NotFound desc = could not find container \"707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2\": container with ID starting with 707d30d85368ace4a1096f6610fbd2d0995919888db6fda397a652b326263bb2 not found: ID does not exist" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.401649 4892 scope.go:117] "RemoveContainer" containerID="76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac" Feb 17 20:10:03 crc kubenswrapper[4892]: E0217 20:10:03.402137 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac\": container with ID starting with 76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac not found: ID does not exist" 
containerID="76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.402209 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac"} err="failed to get container status \"76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac\": rpc error: code = NotFound desc = could not find container \"76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac\": container with ID starting with 76f7d0dd2abcf5cc5477d52bab7e6299b8745e10fe10316a6f3b3a94c0441fac not found: ID does not exist" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.402223 4892 scope.go:117] "RemoveContainer" containerID="caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207" Feb 17 20:10:03 crc kubenswrapper[4892]: E0217 20:10:03.402750 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207\": container with ID starting with caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207 not found: ID does not exist" containerID="caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.402793 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207"} err="failed to get container status \"caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207\": rpc error: code = NotFound desc = could not find container \"caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207\": container with ID starting with caa2a35beacddd6be13358c3905c6c7b1c8cbcaf78f3469d39706824bc40c207 not found: ID does not exist" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.402835 4892 scope.go:117] 
"RemoveContainer" containerID="1c64ab752bc5e66cc7a7098a411531a3daf533cd829bad312c8be8492189604a" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.455975 4892 scope.go:117] "RemoveContainer" containerID="c59b96f0f1058b9d1f168a2ac9505b8849b9b4b8e3630111342edb52404648b0" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.488582 4892 scope.go:117] "RemoveContainer" containerID="8916b2ec1365ba7be86238f8b9137908aef59a2f71931e55e3ee84a77219bf90" Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.581078 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qj4k"] Feb 17 20:10:03 crc kubenswrapper[4892]: I0217 20:10:03.581302 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qj4k" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="registry-server" containerID="cri-o://31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995" gracePeriod=2 Feb 17 20:10:05 crc kubenswrapper[4892]: I0217 20:10:05.292917 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bv8xp_36d16e04-a79d-42f9-bcb8-e9115efb1eae/registry-server/0.log" Feb 17 20:10:05 crc kubenswrapper[4892]: I0217 20:10:05.295210 4892 generic.go:334] "Generic (PLEG): container finished" podID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerID="803ffa13fee64053edce941066bb95ac5f892a4716a2b03b73a3fa0c42dae1ee" exitCode=137 Feb 17 20:10:05 crc kubenswrapper[4892]: I0217 20:10:05.295256 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerDied","Data":"803ffa13fee64053edce941066bb95ac5f892a4716a2b03b73a3fa0c42dae1ee"} Feb 17 20:10:05 crc kubenswrapper[4892]: I0217 20:10:05.373385 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" 
path="/var/lib/kubelet/pods/e446b496-04a9-4d11-b207-cdfe5ccc6d5a/volumes" Feb 17 20:10:05 crc kubenswrapper[4892]: E0217 20:10:05.647373 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8823d3_7d81_4ab1_8726_e77a2d0024e6.slice/crio-31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995.scope\": RecentStats: unable to find data in memory cache]" Feb 17 20:10:05 crc kubenswrapper[4892]: I0217 20:10:05.989162 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bv8xp_36d16e04-a79d-42f9-bcb8-e9115efb1eae/registry-server/0.log" Feb 17 20:10:05 crc kubenswrapper[4892]: I0217 20:10:05.990897 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.116272 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6qj4k_8a8823d3-7d81-4ab1-8726-e77a2d0024e6/registry-server/0.log" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.116933 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.170976 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n5jq\" (UniqueName: \"kubernetes.io/projected/36d16e04-a79d-42f9-bcb8-e9115efb1eae-kube-api-access-6n5jq\") pod \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.171131 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-utilities\") pod \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.171236 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-catalog-content\") pod \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\" (UID: \"36d16e04-a79d-42f9-bcb8-e9115efb1eae\") " Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.178937 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-utilities" (OuterVolumeSpecName: "utilities") pod "36d16e04-a79d-42f9-bcb8-e9115efb1eae" (UID: "36d16e04-a79d-42f9-bcb8-e9115efb1eae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.181987 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d16e04-a79d-42f9-bcb8-e9115efb1eae-kube-api-access-6n5jq" (OuterVolumeSpecName: "kube-api-access-6n5jq") pod "36d16e04-a79d-42f9-bcb8-e9115efb1eae" (UID: "36d16e04-a79d-42f9-bcb8-e9115efb1eae"). InnerVolumeSpecName "kube-api-access-6n5jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.211213 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d16e04-a79d-42f9-bcb8-e9115efb1eae" (UID: "36d16e04-a79d-42f9-bcb8-e9115efb1eae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.273472 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tg25\" (UniqueName: \"kubernetes.io/projected/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-kube-api-access-6tg25\") pod \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.273668 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-utilities\") pod \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.273809 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-catalog-content\") pod \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\" (UID: \"8a8823d3-7d81-4ab1-8726-e77a2d0024e6\") " Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.274448 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.274474 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/36d16e04-a79d-42f9-bcb8-e9115efb1eae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.274489 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n5jq\" (UniqueName: \"kubernetes.io/projected/36d16e04-a79d-42f9-bcb8-e9115efb1eae-kube-api-access-6n5jq\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.275223 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-utilities" (OuterVolumeSpecName: "utilities") pod "8a8823d3-7d81-4ab1-8726-e77a2d0024e6" (UID: "8a8823d3-7d81-4ab1-8726-e77a2d0024e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.278999 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-kube-api-access-6tg25" (OuterVolumeSpecName: "kube-api-access-6tg25") pod "8a8823d3-7d81-4ab1-8726-e77a2d0024e6" (UID: "8a8823d3-7d81-4ab1-8726-e77a2d0024e6"). InnerVolumeSpecName "kube-api-access-6tg25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.307307 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6qj4k_8a8823d3-7d81-4ab1-8726-e77a2d0024e6/registry-server/0.log" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.307967 4892 generic.go:334] "Generic (PLEG): container finished" podID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerID="31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995" exitCode=137 Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.308026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerDied","Data":"31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995"} Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.308059 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qj4k" event={"ID":"8a8823d3-7d81-4ab1-8726-e77a2d0024e6","Type":"ContainerDied","Data":"71c955501ff6260aa202dee21dfc569b10fa6621cf31fe425022cce59e8e21d4"} Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.308080 4892 scope.go:117] "RemoveContainer" containerID="31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.308275 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qj4k" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.308631 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a8823d3-7d81-4ab1-8726-e77a2d0024e6" (UID: "8a8823d3-7d81-4ab1-8726-e77a2d0024e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.312200 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bv8xp_36d16e04-a79d-42f9-bcb8-e9115efb1eae/registry-server/0.log" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.321166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv8xp" event={"ID":"36d16e04-a79d-42f9-bcb8-e9115efb1eae","Type":"ContainerDied","Data":"c10c388e88659a93c55a7b3e10c4c2dba0447b09b6ede9c148ad91c4b935adba"} Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.321278 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv8xp" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.353224 4892 scope.go:117] "RemoveContainer" containerID="5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.359572 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bv8xp"] Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.370690 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bv8xp"] Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.376614 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.376657 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.376667 4892 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6tg25\" (UniqueName: \"kubernetes.io/projected/8a8823d3-7d81-4ab1-8726-e77a2d0024e6-kube-api-access-6tg25\") on node \"crc\" DevicePath \"\"" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.383612 4892 scope.go:117] "RemoveContainer" containerID="ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.428427 4892 scope.go:117] "RemoveContainer" containerID="31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995" Feb 17 20:10:06 crc kubenswrapper[4892]: E0217 20:10:06.428963 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995\": container with ID starting with 31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995 not found: ID does not exist" containerID="31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.429011 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995"} err="failed to get container status \"31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995\": rpc error: code = NotFound desc = could not find container \"31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995\": container with ID starting with 31479b29c172110aee0c402d9eee0ecb00c6341d89987efadcbb60135886a995 not found: ID does not exist" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.429038 4892 scope.go:117] "RemoveContainer" containerID="5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee" Feb 17 20:10:06 crc kubenswrapper[4892]: E0217 20:10:06.429674 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee\": container with ID starting with 5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee not found: ID does not exist" containerID="5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.429714 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee"} err="failed to get container status \"5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee\": rpc error: code = NotFound desc = could not find container \"5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee\": container with ID starting with 5772aae5990ff53de41fb1ecd34b0bc6546450216efa0097d3839f7398e3acee not found: ID does not exist" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.429739 4892 scope.go:117] "RemoveContainer" containerID="ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09" Feb 17 20:10:06 crc kubenswrapper[4892]: E0217 20:10:06.430690 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09\": container with ID starting with ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09 not found: ID does not exist" containerID="ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.430720 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09"} err="failed to get container status \"ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09\": rpc error: code = NotFound desc = could not find container \"ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09\": container with ID 
starting with ad79dfeb408341e4b7e6e20c9b2e82562148b058458200823d0beed3405fca09 not found: ID does not exist" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.430743 4892 scope.go:117] "RemoveContainer" containerID="803ffa13fee64053edce941066bb95ac5f892a4716a2b03b73a3fa0c42dae1ee" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.470784 4892 scope.go:117] "RemoveContainer" containerID="5260f20fe3b032cf80ab688ac96b7b0facf7d74885a46af27ad7a71d3b6a74b2" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.494151 4892 scope.go:117] "RemoveContainer" containerID="68f6fa6edc0a5e9e95eaaa02dc0fcedf0c4145bbb806c96f182338c53924d977" Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.646898 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qj4k"] Feb 17 20:10:06 crc kubenswrapper[4892]: I0217 20:10:06.657577 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qj4k"] Feb 17 20:10:07 crc kubenswrapper[4892]: I0217 20:10:07.374118 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" path="/var/lib/kubelet/pods/36d16e04-a79d-42f9-bcb8-e9115efb1eae/volumes" Feb 17 20:10:07 crc kubenswrapper[4892]: I0217 20:10:07.375378 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" path="/var/lib/kubelet/pods/8a8823d3-7d81-4ab1-8726-e77a2d0024e6/volumes" Feb 17 20:10:10 crc kubenswrapper[4892]: I0217 20:10:10.360385 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:10:10 crc kubenswrapper[4892]: E0217 20:10:10.361385 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:10:24 crc kubenswrapper[4892]: I0217 20:10:24.361006 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:10:24 crc kubenswrapper[4892]: E0217 20:10:24.361930 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:10:38 crc kubenswrapper[4892]: I0217 20:10:38.360710 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:10:38 crc kubenswrapper[4892]: E0217 20:10:38.361887 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:10:49 crc kubenswrapper[4892]: I0217 20:10:49.379150 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:10:49 crc kubenswrapper[4892]: E0217 20:10:49.380785 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:11:04 crc kubenswrapper[4892]: I0217 20:11:04.360275 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:11:04 crc kubenswrapper[4892]: E0217 20:11:04.361521 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:11:17 crc kubenswrapper[4892]: I0217 20:11:17.360140 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:11:18 crc kubenswrapper[4892]: I0217 20:11:18.378097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"16db715ca4968324805ed8b7d8e7d0822fd44599b240bf90b59ee6d89baa29e3"} Feb 17 20:12:26 crc kubenswrapper[4892]: I0217 20:12:26.296313 4892 generic.go:334] "Generic (PLEG): container finished" podID="e13a74be-1b5c-4c8f-9c61-5aa0965a4610" containerID="4ab13c964ab0206a0c133b840cd20bbfd2d901516632ade84ac120a831a22763" exitCode=0 Feb 17 20:12:26 crc kubenswrapper[4892]: I0217 20:12:26.296842 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" 
event={"ID":"e13a74be-1b5c-4c8f-9c61-5aa0965a4610","Type":"ContainerDied","Data":"4ab13c964ab0206a0c133b840cd20bbfd2d901516632ade84ac120a831a22763"} Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.824313 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.933830 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-2\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934226 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ssh-key-openstack-cell1\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934325 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-inventory\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934383 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-0\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934411 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qm724\" (UniqueName: \"kubernetes.io/projected/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-kube-api-access-qm724\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934440 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-1\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934501 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-0\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934529 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ceph\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934836 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-1\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934893 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-3\") pod 
\"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.934992 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-combined-ca-bundle\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.935062 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-1\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.935089 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-0\") pod \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\" (UID: \"e13a74be-1b5c-4c8f-9c61-5aa0965a4610\") " Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.950086 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ceph" (OuterVolumeSpecName: "ceph") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.950217 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-kube-api-access-qm724" (OuterVolumeSpecName: "kube-api-access-qm724") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). 
InnerVolumeSpecName "kube-api-access-qm724". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.952466 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.969116 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.974113 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.975139 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-inventory" (OuterVolumeSpecName: "inventory") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.975182 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.978441 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.983987 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.985730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.988420 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:12:27 crc kubenswrapper[4892]: I0217 20:12:27.997243 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.011243 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "e13a74be-1b5c-4c8f-9c61-5aa0965a4610" (UID: "e13a74be-1b5c-4c8f-9c61-5aa0965a4610"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037654 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037689 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037698 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037706 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037716 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037724 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037735 4892 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037744 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm724\" (UniqueName: \"kubernetes.io/projected/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-kube-api-access-qm724\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037753 4892 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037761 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037769 4892 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-ceph\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037777 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.037787 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e13a74be-1b5c-4c8f-9c61-5aa0965a4610-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.319075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" event={"ID":"e13a74be-1b5c-4c8f-9c61-5aa0965a4610","Type":"ContainerDied","Data":"59878c3afc46e1b7c947524cf872f2ab3c1a8ad6748a6c416dbc2ecde6e27529"} Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.319117 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59878c3afc46e1b7c947524cf872f2ab3c1a8ad6748a6c416dbc2ecde6e27529" Feb 17 20:12:28 crc kubenswrapper[4892]: I0217 20:12:28.319178 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.113562 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rffh2"] Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119517 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119552 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119569 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119579 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119609 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119615 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119634 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119641 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119657 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119663 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119678 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119686 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119705 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119712 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119727 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119733 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119745 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119753 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119768 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119773 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119783 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119789 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119800 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119807 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119835 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119842 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119853 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119860 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119874 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119880 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119891 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119897 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="registry-server" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119907 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13a74be-1b5c-4c8f-9c61-5aa0965a4610" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119915 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13a74be-1b5c-4c8f-9c61-5aa0965a4610" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119927 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="extract-content" Feb 17 20:13:13 crc 
kubenswrapper[4892]: I0217 20:13:13.119933 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119946 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119951 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119957 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119963 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119977 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119982 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="extract-content" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.119992 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.119998 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="extract-utilities" Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120012 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="registry-server" Feb 17 20:13:13 crc 
kubenswrapper[4892]: I0217 20:13:13.120017 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120025 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="extract-utilities"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120031 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="extract-utilities"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120039 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120045 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120056 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120062 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120070 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120076 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120086 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120092 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120107 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120113 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120123 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120128 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120141 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="extract-utilities"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120146 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="extract-utilities"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120157 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120163 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120172 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120177 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120194 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120200 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120210 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120219 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120225 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="extract-utilities"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120231 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="extract-utilities"
Feb 17 20:13:13 crc kubenswrapper[4892]: E0217 20:13:13.120240 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120246 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="extract-content"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120467 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="330bbc2b-6580-4c85-8180-847df82d681f" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120479 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6edd657-e697-42e2-bfc4-3ea98348eb23" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120487 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42f2f0e-3e80-4231-9daa-5abdd0a0091e" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120495 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a36169-14a5-4d2a-9f66-fb343852b6a6" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120509 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d16e04-a79d-42f9-bcb8-e9115efb1eae" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120519 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cfcd99-3052-49c6-991c-571a85bdeba5" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120530 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6875b4-db50-44e7-b270-ceb7bc6ef804" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120536 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8823d3-7d81-4ab1-8726-e77a2d0024e6" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120548 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="12793f08-a0ab-411a-9449-b0b2d0834e5e" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120558 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13a74be-1b5c-4c8f-9c61-5aa0965a4610" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120569 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b23b6a9-794e-40ce-93b0-8b60ec25cf20" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120580 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9918aee-e88f-48dc-a1dd-f5de340b5570" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.120594 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e446b496-04a9-4d11-b207-cdfe5ccc6d5a" containerName="registry-server"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.122400 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.130507 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rffh2"]
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.217979 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bphrq\" (UniqueName: \"kubernetes.io/projected/e080e4b8-b48a-4744-8f3f-86656dd50ad3-kube-api-access-bphrq\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.218295 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-utilities\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.218771 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-catalog-content\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.320975 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-utilities\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.321107 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-catalog-content\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.321176 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bphrq\" (UniqueName: \"kubernetes.io/projected/e080e4b8-b48a-4744-8f3f-86656dd50ad3-kube-api-access-bphrq\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.321514 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-utilities\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.321651 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-catalog-content\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.339972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bphrq\" (UniqueName: \"kubernetes.io/projected/e080e4b8-b48a-4744-8f3f-86656dd50ad3-kube-api-access-bphrq\") pod \"community-operators-rffh2\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") " pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:13 crc kubenswrapper[4892]: I0217 20:13:13.446480 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:14 crc kubenswrapper[4892]: I0217 20:13:14.079991 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rffh2"]
Feb 17 20:13:14 crc kubenswrapper[4892]: I0217 20:13:14.954730 4892 generic.go:334] "Generic (PLEG): container finished" podID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerID="aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec" exitCode=0
Feb 17 20:13:14 crc kubenswrapper[4892]: I0217 20:13:14.954786 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerDied","Data":"aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec"}
Feb 17 20:13:14 crc kubenswrapper[4892]: I0217 20:13:14.955071 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerStarted","Data":"031391eec20ad76d6692478078f6a755bd99f841f0a532eb8943ca915db54b82"}
Feb 17 20:13:14 crc kubenswrapper[4892]: I0217 20:13:14.958168 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 20:13:16 crc kubenswrapper[4892]: I0217 20:13:16.902552 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-blnzd"]
Feb 17 20:13:16 crc kubenswrapper[4892]: I0217 20:13:16.907251 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:16 crc kubenswrapper[4892]: I0217 20:13:16.927845 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-blnzd"]
Feb 17 20:13:16 crc kubenswrapper[4892]: I0217 20:13:16.984402 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerStarted","Data":"486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b"}
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.007318 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-utilities\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.007401 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxzf\" (UniqueName: \"kubernetes.io/projected/f4f753ad-0539-4f45-8163-2208b6e3da91-kube-api-access-5nxzf\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.007462 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-catalog-content\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.110636 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-utilities\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.110711 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxzf\" (UniqueName: \"kubernetes.io/projected/f4f753ad-0539-4f45-8163-2208b6e3da91-kube-api-access-5nxzf\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.110748 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-catalog-content\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.111358 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-catalog-content\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.111383 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-utilities\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.132684 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxzf\" (UniqueName: \"kubernetes.io/projected/f4f753ad-0539-4f45-8163-2208b6e3da91-kube-api-access-5nxzf\") pod \"redhat-operators-blnzd\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") " pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.247087 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:17 crc kubenswrapper[4892]: I0217 20:13:17.764232 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-blnzd"]
Feb 17 20:13:18 crc kubenswrapper[4892]: I0217 20:13:18.000442 4892 generic.go:334] "Generic (PLEG): container finished" podID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerID="486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b" exitCode=0
Feb 17 20:13:18 crc kubenswrapper[4892]: I0217 20:13:18.000585 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerDied","Data":"486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b"}
Feb 17 20:13:18 crc kubenswrapper[4892]: I0217 20:13:18.007289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerStarted","Data":"02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8"}
Feb 17 20:13:18 crc kubenswrapper[4892]: I0217 20:13:18.007333 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerStarted","Data":"cdb7e9fc7910d3818eff47923efdbd5b89db74952c36faa5d0d3f1525184fd2d"}
Feb 17 20:13:19 crc kubenswrapper[4892]: I0217 20:13:19.029961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerStarted","Data":"8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653"}
Feb 17 20:13:19 crc kubenswrapper[4892]: I0217 20:13:19.033555 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerID="02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8" exitCode=0
Feb 17 20:13:19 crc kubenswrapper[4892]: I0217 20:13:19.033622 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerDied","Data":"02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8"}
Feb 17 20:13:19 crc kubenswrapper[4892]: I0217 20:13:19.064405 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rffh2" podStartSLOduration=2.62962165 podStartE2EDuration="6.064384237s" podCreationTimestamp="2026-02-17 20:13:13 +0000 UTC" firstStartedPulling="2026-02-17 20:13:14.957870087 +0000 UTC m=+8966.333273362" lastFinishedPulling="2026-02-17 20:13:18.392632684 +0000 UTC m=+8969.768035949" observedRunningTime="2026-02-17 20:13:19.049330892 +0000 UTC m=+8970.424734157" watchObservedRunningTime="2026-02-17 20:13:19.064384237 +0000 UTC m=+8970.439787512"
Feb 17 20:13:21 crc kubenswrapper[4892]: I0217 20:13:21.065489 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerStarted","Data":"08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980"}
Feb 17 20:13:22 crc kubenswrapper[4892]: I0217 20:13:22.090020 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerID="08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980" exitCode=0
Feb 17 20:13:22 crc kubenswrapper[4892]: I0217 20:13:22.090978 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerDied","Data":"08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980"}
Feb 17 20:13:23 crc kubenswrapper[4892]: I0217 20:13:23.103168 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerStarted","Data":"f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116"}
Feb 17 20:13:23 crc kubenswrapper[4892]: I0217 20:13:23.135921 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-blnzd" podStartSLOduration=3.635634535 podStartE2EDuration="7.135901624s" podCreationTimestamp="2026-02-17 20:13:16 +0000 UTC" firstStartedPulling="2026-02-17 20:13:19.035175691 +0000 UTC m=+8970.410578956" lastFinishedPulling="2026-02-17 20:13:22.53544278 +0000 UTC m=+8973.910846045" observedRunningTime="2026-02-17 20:13:23.120242984 +0000 UTC m=+8974.495646249" watchObservedRunningTime="2026-02-17 20:13:23.135901624 +0000 UTC m=+8974.511304889"
Feb 17 20:13:23 crc kubenswrapper[4892]: I0217 20:13:23.447468 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:23 crc kubenswrapper[4892]: I0217 20:13:23.447521 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:23 crc kubenswrapper[4892]: I0217 20:13:23.529602 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:24 crc kubenswrapper[4892]: I0217 20:13:24.190151 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:25 crc kubenswrapper[4892]: I0217 20:13:25.899619 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rffh2"]
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.148147 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rffh2" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="registry-server" containerID="cri-o://8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653" gracePeriod=2
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.655470 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.741160 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bphrq\" (UniqueName: \"kubernetes.io/projected/e080e4b8-b48a-4744-8f3f-86656dd50ad3-kube-api-access-bphrq\") pod \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") "
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.741207 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-utilities\") pod \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") "
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.741303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-catalog-content\") pod \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\" (UID: \"e080e4b8-b48a-4744-8f3f-86656dd50ad3\") "
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.742792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-utilities" (OuterVolumeSpecName: "utilities") pod "e080e4b8-b48a-4744-8f3f-86656dd50ad3" (UID: "e080e4b8-b48a-4744-8f3f-86656dd50ad3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.763574 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e080e4b8-b48a-4744-8f3f-86656dd50ad3-kube-api-access-bphrq" (OuterVolumeSpecName: "kube-api-access-bphrq") pod "e080e4b8-b48a-4744-8f3f-86656dd50ad3" (UID: "e080e4b8-b48a-4744-8f3f-86656dd50ad3"). InnerVolumeSpecName "kube-api-access-bphrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.819403 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e080e4b8-b48a-4744-8f3f-86656dd50ad3" (UID: "e080e4b8-b48a-4744-8f3f-86656dd50ad3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.843880 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bphrq\" (UniqueName: \"kubernetes.io/projected/e080e4b8-b48a-4744-8f3f-86656dd50ad3-kube-api-access-bphrq\") on node \"crc\" DevicePath \"\""
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.843912 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 20:13:26 crc kubenswrapper[4892]: I0217 20:13:26.843921 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e080e4b8-b48a-4744-8f3f-86656dd50ad3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.163036 4892 generic.go:334] "Generic (PLEG): container finished" podID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerID="8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653" exitCode=0
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.163082 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerDied","Data":"8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653"}
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.163118 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rffh2" event={"ID":"e080e4b8-b48a-4744-8f3f-86656dd50ad3","Type":"ContainerDied","Data":"031391eec20ad76d6692478078f6a755bd99f841f0a532eb8943ca915db54b82"}
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.163136 4892 scope.go:117] "RemoveContainer" containerID="8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.165478 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rffh2"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.192184 4892 scope.go:117] "RemoveContainer" containerID="486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.213483 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rffh2"]
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.244317 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rffh2"]
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.247200 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.247238 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.285067 4892 scope.go:117] "RemoveContainer" containerID="aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.308473 4892 scope.go:117] "RemoveContainer" containerID="8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653"
Feb 17 20:13:27 crc kubenswrapper[4892]: E0217 20:13:27.310129 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653\": container with ID starting with 8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653 not found: ID does not exist" containerID="8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.310190 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653"} err="failed to get container status \"8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653\": rpc error: code = NotFound desc = could not find container \"8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653\": container with ID starting with 8c7d7cb4d918e7e134b77d12a46231ac275791b557a52ca33916acaed5554653 not found: ID does not exist"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.310263 4892 scope.go:117] "RemoveContainer" containerID="486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b"
Feb 17 20:13:27 crc kubenswrapper[4892]: E0217 20:13:27.310877 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b\": container with ID starting with 486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b not found: ID does not exist" containerID="486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.310934 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b"} err="failed to get container status \"486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b\": rpc error: code = NotFound desc = could not find container \"486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b\": container with ID starting with 486e792e2302bfd618cb34a3851c77dd2ac11d9d2984d51cf3bdae01e6b1758b not found: ID does not exist"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.310965 4892 scope.go:117] "RemoveContainer" containerID="aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec"
Feb 17 20:13:27 crc kubenswrapper[4892]: E0217 20:13:27.311212 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec\": container with ID starting with aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec not found: ID does not exist" containerID="aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.311248 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec"} err="failed to get container status \"aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec\": rpc error: code = NotFound desc = could not find container \"aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec\": container with ID starting with aa81d7e2b6b07a51530ae043702e0e6aaa9eeb2bc36f2724e9953a26019b35ec not found: ID does not exist"
Feb 17 20:13:27 crc kubenswrapper[4892]: I0217 20:13:27.371440 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" path="/var/lib/kubelet/pods/e080e4b8-b48a-4744-8f3f-86656dd50ad3/volumes"
Feb 17 20:13:28 crc kubenswrapper[4892]: I0217 20:13:28.308990 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-blnzd" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="registry-server" probeResult="failure" output=<
Feb 17 20:13:28 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Feb 17 20:13:28 crc kubenswrapper[4892]: >
Feb 17 20:13:37 crc kubenswrapper[4892]: I0217 20:13:37.308973 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:37 crc kubenswrapper[4892]: I0217 20:13:37.389525 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:37 crc kubenswrapper[4892]: I0217 20:13:37.425323 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 20:13:37 crc kubenswrapper[4892]: I0217 20:13:37.425378 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 20:13:37 crc kubenswrapper[4892]: I0217 20:13:37.549728 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-blnzd"]
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.309533 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-blnzd" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="registry-server" containerID="cri-o://f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116" gracePeriod=2
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.833450 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blnzd"
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.849405 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nxzf\" (UniqueName: \"kubernetes.io/projected/f4f753ad-0539-4f45-8163-2208b6e3da91-kube-api-access-5nxzf\") pod \"f4f753ad-0539-4f45-8163-2208b6e3da91\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") "
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.849660 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-catalog-content\") pod \"f4f753ad-0539-4f45-8163-2208b6e3da91\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") "
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.849740 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-utilities\") pod \"f4f753ad-0539-4f45-8163-2208b6e3da91\" (UID: \"f4f753ad-0539-4f45-8163-2208b6e3da91\") "
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.850522 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-utilities" (OuterVolumeSpecName: "utilities") pod "f4f753ad-0539-4f45-8163-2208b6e3da91" (UID: "f4f753ad-0539-4f45-8163-2208b6e3da91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.855322 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f753ad-0539-4f45-8163-2208b6e3da91-kube-api-access-5nxzf" (OuterVolumeSpecName: "kube-api-access-5nxzf") pod "f4f753ad-0539-4f45-8163-2208b6e3da91" (UID: "f4f753ad-0539-4f45-8163-2208b6e3da91"). InnerVolumeSpecName "kube-api-access-5nxzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.953041 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nxzf\" (UniqueName: \"kubernetes.io/projected/f4f753ad-0539-4f45-8163-2208b6e3da91-kube-api-access-5nxzf\") on node \"crc\" DevicePath \"\""
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.953077 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 20:13:39 crc kubenswrapper[4892]: I0217 20:13:39.993315 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f753ad-0539-4f45-8163-2208b6e3da91" (UID: "f4f753ad-0539-4f45-8163-2208b6e3da91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.054356 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f753ad-0539-4f45-8163-2208b6e3da91-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.324851 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerID="f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116" exitCode=0
Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.324917 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-blnzd" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.324931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerDied","Data":"f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116"} Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.325117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blnzd" event={"ID":"f4f753ad-0539-4f45-8163-2208b6e3da91","Type":"ContainerDied","Data":"cdb7e9fc7910d3818eff47923efdbd5b89db74952c36faa5d0d3f1525184fd2d"} Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.325154 4892 scope.go:117] "RemoveContainer" containerID="f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.353040 4892 scope.go:117] "RemoveContainer" containerID="08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.376805 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-blnzd"] Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.384010 4892 scope.go:117] "RemoveContainer" containerID="02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.387978 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-blnzd"] Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.436533 4892 scope.go:117] "RemoveContainer" containerID="f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116" Feb 17 20:13:40 crc kubenswrapper[4892]: E0217 20:13:40.436993 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116\": container with ID starting with f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116 not found: ID does not exist" containerID="f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.437028 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116"} err="failed to get container status \"f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116\": rpc error: code = NotFound desc = could not find container \"f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116\": container with ID starting with f6a2b2806870bb4aa181382506b4757896fc9068c70e7b695b6ee1d4b8cad116 not found: ID does not exist" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.437063 4892 scope.go:117] "RemoveContainer" containerID="08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980" Feb 17 20:13:40 crc kubenswrapper[4892]: E0217 20:13:40.437404 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980\": container with ID starting with 08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980 not found: ID does not exist" containerID="08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.437431 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980"} err="failed to get container status \"08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980\": rpc error: code = NotFound desc = could not find container \"08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980\": container with ID 
starting with 08c1d4fe5d35fe111b82c317b6ddfa3559b6d9a60c3e0ec8ab41e86f8be11980 not found: ID does not exist" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.437472 4892 scope.go:117] "RemoveContainer" containerID="02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8" Feb 17 20:13:40 crc kubenswrapper[4892]: E0217 20:13:40.437728 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8\": container with ID starting with 02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8 not found: ID does not exist" containerID="02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8" Feb 17 20:13:40 crc kubenswrapper[4892]: I0217 20:13:40.437770 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8"} err="failed to get container status \"02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8\": rpc error: code = NotFound desc = could not find container \"02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8\": container with ID starting with 02c8ac63c107273024d64ed9a508c8ba645ca68e39c522f0f0767278067ad7b8 not found: ID does not exist" Feb 17 20:13:41 crc kubenswrapper[4892]: I0217 20:13:41.373442 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" path="/var/lib/kubelet/pods/f4f753ad-0539-4f45-8163-2208b6e3da91/volumes" Feb 17 20:14:07 crc kubenswrapper[4892]: I0217 20:14:07.425289 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:14:07 crc kubenswrapper[4892]: I0217 
20:14:07.425735 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:14:37 crc kubenswrapper[4892]: I0217 20:14:37.424631 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:14:37 crc kubenswrapper[4892]: I0217 20:14:37.425329 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:14:37 crc kubenswrapper[4892]: I0217 20:14:37.425392 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 20:14:37 crc kubenswrapper[4892]: I0217 20:14:37.426548 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16db715ca4968324805ed8b7d8e7d0822fd44599b240bf90b59ee6d89baa29e3"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 20:14:37 crc kubenswrapper[4892]: I0217 20:14:37.426653 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" 
containerName="machine-config-daemon" containerID="cri-o://16db715ca4968324805ed8b7d8e7d0822fd44599b240bf90b59ee6d89baa29e3" gracePeriod=600 Feb 17 20:14:38 crc kubenswrapper[4892]: I0217 20:14:38.102280 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="16db715ca4968324805ed8b7d8e7d0822fd44599b240bf90b59ee6d89baa29e3" exitCode=0 Feb 17 20:14:38 crc kubenswrapper[4892]: I0217 20:14:38.102374 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"16db715ca4968324805ed8b7d8e7d0822fd44599b240bf90b59ee6d89baa29e3"} Feb 17 20:14:38 crc kubenswrapper[4892]: I0217 20:14:38.102872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"} Feb 17 20:14:38 crc kubenswrapper[4892]: I0217 20:14:38.102898 4892 scope.go:117] "RemoveContainer" containerID="dd34609e40c62d0dc2fae95fb119b2fff9896a684cd3e8b1cf99b9b16a2445f8" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.152198 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x"] Feb 17 20:15:00 crc kubenswrapper[4892]: E0217 20:15:00.153069 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="extract-content" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153081 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="extract-content" Feb 17 20:15:00 crc kubenswrapper[4892]: E0217 20:15:00.153106 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="extract-content" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153113 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="extract-content" Feb 17 20:15:00 crc kubenswrapper[4892]: E0217 20:15:00.153137 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="registry-server" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153144 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="registry-server" Feb 17 20:15:00 crc kubenswrapper[4892]: E0217 20:15:00.153156 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="registry-server" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153161 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="registry-server" Feb 17 20:15:00 crc kubenswrapper[4892]: E0217 20:15:00.153186 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="extract-utilities" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153193 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="extract-utilities" Feb 17 20:15:00 crc kubenswrapper[4892]: E0217 20:15:00.153202 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="extract-utilities" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153208 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="extract-utilities" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153413 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e080e4b8-b48a-4744-8f3f-86656dd50ad3" containerName="registry-server" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.153429 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f753ad-0539-4f45-8163-2208b6e3da91" containerName="registry-server" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.154213 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.163073 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.164113 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.170842 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x"] Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.343777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-config-volume\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.344096 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9szj\" (UniqueName: \"kubernetes.io/projected/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-kube-api-access-g9szj\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 
20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.344319 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-secret-volume\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.446773 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-config-volume\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.447071 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9szj\" (UniqueName: \"kubernetes.io/projected/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-kube-api-access-g9szj\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.447258 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-secret-volume\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.449058 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-config-volume\") pod \"collect-profiles-29522655-dss9x\" (UID: 
\"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.459330 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-secret-volume\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.481591 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9szj\" (UniqueName: \"kubernetes.io/projected/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-kube-api-access-g9szj\") pod \"collect-profiles-29522655-dss9x\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:00 crc kubenswrapper[4892]: I0217 20:15:00.781084 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:01 crc kubenswrapper[4892]: I0217 20:15:01.285238 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x"] Feb 17 20:15:01 crc kubenswrapper[4892]: I0217 20:15:01.653614 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" event={"ID":"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6","Type":"ContainerStarted","Data":"e05093bfe992c9b140b934c4a9178c5b75c27c77b86be61cf28f866ea991f7a6"} Feb 17 20:15:01 crc kubenswrapper[4892]: I0217 20:15:01.653962 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" event={"ID":"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6","Type":"ContainerStarted","Data":"fd9e94009291b281e582e21a593e495bee54fd3b38cd2fc5d6a8ab7230f6e5a4"} Feb 17 20:15:01 crc kubenswrapper[4892]: I0217 20:15:01.680112 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" podStartSLOduration=1.680088187 podStartE2EDuration="1.680088187s" podCreationTimestamp="2026-02-17 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:15:01.668535536 +0000 UTC m=+9073.043938791" watchObservedRunningTime="2026-02-17 20:15:01.680088187 +0000 UTC m=+9073.055491452" Feb 17 20:15:02 crc kubenswrapper[4892]: I0217 20:15:02.671548 4892 generic.go:334] "Generic (PLEG): container finished" podID="fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" containerID="e05093bfe992c9b140b934c4a9178c5b75c27c77b86be61cf28f866ea991f7a6" exitCode=0 Feb 17 20:15:02 crc kubenswrapper[4892]: I0217 20:15:02.671614 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" event={"ID":"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6","Type":"ContainerDied","Data":"e05093bfe992c9b140b934c4a9178c5b75c27c77b86be61cf28f866ea991f7a6"} Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.136576 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.239378 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-config-volume\") pod \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.239557 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-secret-volume\") pod \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.239591 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9szj\" (UniqueName: \"kubernetes.io/projected/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-kube-api-access-g9szj\") pod \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\" (UID: \"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6\") " Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.251244 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" (UID: "fa4b27ff-1607-4e5d-b9ff-2a563e4613b6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.255974 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" (UID: "fa4b27ff-1607-4e5d-b9ff-2a563e4613b6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.256164 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-kube-api-access-g9szj" (OuterVolumeSpecName: "kube-api-access-g9szj") pod "fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" (UID: "fa4b27ff-1607-4e5d-b9ff-2a563e4613b6"). InnerVolumeSpecName "kube-api-access-g9szj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.342265 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.342549 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.342647 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9szj\" (UniqueName: \"kubernetes.io/projected/fa4b27ff-1607-4e5d-b9ff-2a563e4613b6-kube-api-access-g9szj\") on node \"crc\" DevicePath \"\"" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.342416 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"] Feb 17 20:15:04 crc kubenswrapper[4892]: 
I0217 20:15:04.355411 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522610-lwvp7"] Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.696512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" event={"ID":"fa4b27ff-1607-4e5d-b9ff-2a563e4613b6","Type":"ContainerDied","Data":"fd9e94009291b281e582e21a593e495bee54fd3b38cd2fc5d6a8ab7230f6e5a4"} Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.696889 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd9e94009291b281e582e21a593e495bee54fd3b38cd2fc5d6a8ab7230f6e5a4" Feb 17 20:15:04 crc kubenswrapper[4892]: I0217 20:15:04.696602 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522655-dss9x" Feb 17 20:15:05 crc kubenswrapper[4892]: I0217 20:15:05.382286 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcc45a7-ec52-48c2-abf1-ba45be48d183" path="/var/lib/kubelet/pods/ddcc45a7-ec52-48c2-abf1-ba45be48d183/volumes" Feb 17 20:15:10 crc kubenswrapper[4892]: I0217 20:15:10.566196 4892 scope.go:117] "RemoveContainer" containerID="443ca8a21c8d7b6145693c4135642c571f32ded888065039e4d7a019823a44f7" Feb 17 20:16:37 crc kubenswrapper[4892]: I0217 20:16:37.425139 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:16:37 crc kubenswrapper[4892]: I0217 20:16:37.425797 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:17:07 crc kubenswrapper[4892]: I0217 20:17:07.424977 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:17:07 crc kubenswrapper[4892]: I0217 20:17:07.425567 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:17:37 crc kubenswrapper[4892]: I0217 20:17:37.424805 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:17:37 crc kubenswrapper[4892]: I0217 20:17:37.425886 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:17:37 crc kubenswrapper[4892]: I0217 20:17:37.425950 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 20:17:37 crc kubenswrapper[4892]: I0217 20:17:37.427625 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 20:17:37 crc kubenswrapper[4892]: I0217 20:17:37.427718 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" gracePeriod=600 Feb 17 20:17:37 crc kubenswrapper[4892]: E0217 20:17:37.573182 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:17:38 crc kubenswrapper[4892]: I0217 20:17:38.571844 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" exitCode=0 Feb 17 20:17:38 crc kubenswrapper[4892]: I0217 20:17:38.571893 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"} Feb 17 20:17:38 crc kubenswrapper[4892]: I0217 20:17:38.573789 4892 scope.go:117] "RemoveContainer" containerID="16db715ca4968324805ed8b7d8e7d0822fd44599b240bf90b59ee6d89baa29e3" Feb 17 20:17:38 crc kubenswrapper[4892]: I0217 20:17:38.575066 4892 
scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:17:38 crc kubenswrapper[4892]: E0217 20:17:38.575605 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.525140 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znjnt/must-gather-7pdtl"] Feb 17 20:17:48 crc kubenswrapper[4892]: E0217 20:17:48.526082 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" containerName="collect-profiles" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.526095 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" containerName="collect-profiles" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.526344 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4b27ff-1607-4e5d-b9ff-2a563e4613b6" containerName="collect-profiles" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.529543 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.531302 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-znjnt"/"default-dockercfg-42ljl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.531302 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-znjnt"/"kube-root-ca.crt" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.533667 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-znjnt"/"openshift-service-ca.crt" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.542254 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znjnt/must-gather-7pdtl"] Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.663052 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-must-gather-output\") pod \"must-gather-7pdtl\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.663937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hmg\" (UniqueName: \"kubernetes.io/projected/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-kube-api-access-g5hmg\") pod \"must-gather-7pdtl\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.765810 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5hmg\" (UniqueName: \"kubernetes.io/projected/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-kube-api-access-g5hmg\") pod \"must-gather-7pdtl\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " 
pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.766014 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-must-gather-output\") pod \"must-gather-7pdtl\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.766512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-must-gather-output\") pod \"must-gather-7pdtl\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.787806 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5hmg\" (UniqueName: \"kubernetes.io/projected/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-kube-api-access-g5hmg\") pod \"must-gather-7pdtl\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:48 crc kubenswrapper[4892]: I0217 20:17:48.852373 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:17:49 crc kubenswrapper[4892]: I0217 20:17:49.414500 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znjnt/must-gather-7pdtl"] Feb 17 20:17:49 crc kubenswrapper[4892]: I0217 20:17:49.719601 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/must-gather-7pdtl" event={"ID":"a1c2ecb1-1a9c-4fc7-953f-710e44082d31","Type":"ContainerStarted","Data":"644ed64e6c418f764fc8aaa3d9d183d8dc1b02fab68dcebaac885402b7a9b50b"} Feb 17 20:17:50 crc kubenswrapper[4892]: I0217 20:17:50.360164 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:17:50 crc kubenswrapper[4892]: E0217 20:17:50.360681 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:17:57 crc kubenswrapper[4892]: I0217 20:17:57.848574 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/must-gather-7pdtl" event={"ID":"a1c2ecb1-1a9c-4fc7-953f-710e44082d31","Type":"ContainerStarted","Data":"3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d"} Feb 17 20:17:57 crc kubenswrapper[4892]: I0217 20:17:57.851074 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/must-gather-7pdtl" event={"ID":"a1c2ecb1-1a9c-4fc7-953f-710e44082d31","Type":"ContainerStarted","Data":"1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712"} Feb 17 20:17:57 crc kubenswrapper[4892]: I0217 20:17:57.887629 4892 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-znjnt/must-gather-7pdtl" podStartSLOduration=2.607406881 podStartE2EDuration="9.887606826s" podCreationTimestamp="2026-02-17 20:17:48 +0000 UTC" firstStartedPulling="2026-02-17 20:17:49.418105451 +0000 UTC m=+9240.793508727" lastFinishedPulling="2026-02-17 20:17:56.698305407 +0000 UTC m=+9248.073708672" observedRunningTime="2026-02-17 20:17:57.864056031 +0000 UTC m=+9249.239459316" watchObservedRunningTime="2026-02-17 20:17:57.887606826 +0000 UTC m=+9249.263010112" Feb 17 20:18:01 crc kubenswrapper[4892]: I0217 20:18:01.729655 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znjnt/crc-debug-5djps"] Feb 17 20:18:01 crc kubenswrapper[4892]: I0217 20:18:01.732050 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:01 crc kubenswrapper[4892]: I0217 20:18:01.925366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-host\") pod \"crc-debug-5djps\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:01 crc kubenswrapper[4892]: I0217 20:18:01.925827 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqlhv\" (UniqueName: \"kubernetes.io/projected/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-kube-api-access-xqlhv\") pod \"crc-debug-5djps\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:02 crc kubenswrapper[4892]: I0217 20:18:02.028138 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqlhv\" (UniqueName: \"kubernetes.io/projected/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-kube-api-access-xqlhv\") pod \"crc-debug-5djps\" (UID: 
\"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:02 crc kubenswrapper[4892]: I0217 20:18:02.028230 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-host\") pod \"crc-debug-5djps\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:02 crc kubenswrapper[4892]: I0217 20:18:02.028383 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-host\") pod \"crc-debug-5djps\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:02 crc kubenswrapper[4892]: I0217 20:18:02.047227 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqlhv\" (UniqueName: \"kubernetes.io/projected/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-kube-api-access-xqlhv\") pod \"crc-debug-5djps\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:02 crc kubenswrapper[4892]: I0217 20:18:02.054161 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:02 crc kubenswrapper[4892]: W0217 20:18:02.096096 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34ac24e1_b0d8_4019_bdc6_fd070f5a6d42.slice/crio-fe1c9b798b2d2b195f255ce88cc415b6758c3b8cafc54301ce2ebcd4f25b3d06 WatchSource:0}: Error finding container fe1c9b798b2d2b195f255ce88cc415b6758c3b8cafc54301ce2ebcd4f25b3d06: Status 404 returned error can't find the container with id fe1c9b798b2d2b195f255ce88cc415b6758c3b8cafc54301ce2ebcd4f25b3d06 Feb 17 20:18:02 crc kubenswrapper[4892]: I0217 20:18:02.903929 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-5djps" event={"ID":"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42","Type":"ContainerStarted","Data":"fe1c9b798b2d2b195f255ce88cc415b6758c3b8cafc54301ce2ebcd4f25b3d06"} Feb 17 20:18:03 crc kubenswrapper[4892]: I0217 20:18:03.360816 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:18:03 crc kubenswrapper[4892]: E0217 20:18:03.361380 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:18:15 crc kubenswrapper[4892]: I0217 20:18:15.063394 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-5djps" event={"ID":"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42","Type":"ContainerStarted","Data":"9813447a001c0e62eed38acbeefc8dc1e3544eff8a99077bf34823c5ec4ad29e"} Feb 17 20:18:15 crc kubenswrapper[4892]: I0217 20:18:15.089231 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znjnt/crc-debug-5djps" podStartSLOduration=2.327836893 podStartE2EDuration="14.089213134s" podCreationTimestamp="2026-02-17 20:18:01 +0000 UTC" firstStartedPulling="2026-02-17 20:18:02.099093031 +0000 UTC m=+9253.474496296" lastFinishedPulling="2026-02-17 20:18:13.860469272 +0000 UTC m=+9265.235872537" observedRunningTime="2026-02-17 20:18:15.080082939 +0000 UTC m=+9266.455486204" watchObservedRunningTime="2026-02-17 20:18:15.089213134 +0000 UTC m=+9266.464616399" Feb 17 20:18:18 crc kubenswrapper[4892]: I0217 20:18:18.360548 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:18:18 crc kubenswrapper[4892]: E0217 20:18:18.361234 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:18:31 crc kubenswrapper[4892]: I0217 20:18:31.359475 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:18:31 crc kubenswrapper[4892]: E0217 20:18:31.360182 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:18:37 crc kubenswrapper[4892]: I0217 20:18:37.324800 4892 generic.go:334] 
"Generic (PLEG): container finished" podID="34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" containerID="9813447a001c0e62eed38acbeefc8dc1e3544eff8a99077bf34823c5ec4ad29e" exitCode=0 Feb 17 20:18:37 crc kubenswrapper[4892]: I0217 20:18:37.324846 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-5djps" event={"ID":"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42","Type":"ContainerDied","Data":"9813447a001c0e62eed38acbeefc8dc1e3544eff8a99077bf34823c5ec4ad29e"} Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.457004 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.517890 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znjnt/crc-debug-5djps"] Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.528011 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-host\") pod \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.528331 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqlhv\" (UniqueName: \"kubernetes.io/projected/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-kube-api-access-xqlhv\") pod \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\" (UID: \"34ac24e1-b0d8-4019-bdc6-fd070f5a6d42\") " Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.528357 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-host" (OuterVolumeSpecName: "host") pod "34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" (UID: "34ac24e1-b0d8-4019-bdc6-fd070f5a6d42"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.532528 4892 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-host\") on node \"crc\" DevicePath \"\"" Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.537191 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znjnt/crc-debug-5djps"] Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.542092 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-kube-api-access-xqlhv" (OuterVolumeSpecName: "kube-api-access-xqlhv") pod "34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" (UID: "34ac24e1-b0d8-4019-bdc6-fd070f5a6d42"). InnerVolumeSpecName "kube-api-access-xqlhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:18:38 crc kubenswrapper[4892]: I0217 20:18:38.635223 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqlhv\" (UniqueName: \"kubernetes.io/projected/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42-kube-api-access-xqlhv\") on node \"crc\" DevicePath \"\"" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.385253 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-5djps" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.390191 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" path="/var/lib/kubelet/pods/34ac24e1-b0d8-4019-bdc6-fd070f5a6d42/volumes" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.391128 4892 scope.go:117] "RemoveContainer" containerID="9813447a001c0e62eed38acbeefc8dc1e3544eff8a99077bf34823c5ec4ad29e" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.849577 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znjnt/crc-debug-z2zxw"] Feb 17 20:18:39 crc kubenswrapper[4892]: E0217 20:18:39.850240 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" containerName="container-00" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.850252 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" containerName="container-00" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.850473 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ac24e1-b0d8-4019-bdc6-fd070f5a6d42" containerName="container-00" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.851265 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.982809 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9b30623-5da9-40f3-8191-cdfb4690ab04-host\") pod \"crc-debug-z2zxw\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:39 crc kubenswrapper[4892]: I0217 20:18:39.983554 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7nm6\" (UniqueName: \"kubernetes.io/projected/c9b30623-5da9-40f3-8191-cdfb4690ab04-kube-api-access-h7nm6\") pod \"crc-debug-z2zxw\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:40 crc kubenswrapper[4892]: I0217 20:18:40.086729 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9b30623-5da9-40f3-8191-cdfb4690ab04-host\") pod \"crc-debug-z2zxw\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:40 crc kubenswrapper[4892]: I0217 20:18:40.086894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9b30623-5da9-40f3-8191-cdfb4690ab04-host\") pod \"crc-debug-z2zxw\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:40 crc kubenswrapper[4892]: I0217 20:18:40.087006 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7nm6\" (UniqueName: \"kubernetes.io/projected/c9b30623-5da9-40f3-8191-cdfb4690ab04-kube-api-access-h7nm6\") pod \"crc-debug-z2zxw\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:40 crc 
kubenswrapper[4892]: I0217 20:18:40.107853 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7nm6\" (UniqueName: \"kubernetes.io/projected/c9b30623-5da9-40f3-8191-cdfb4690ab04-kube-api-access-h7nm6\") pod \"crc-debug-z2zxw\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:40 crc kubenswrapper[4892]: I0217 20:18:40.208615 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:40 crc kubenswrapper[4892]: I0217 20:18:40.399348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-z2zxw" event={"ID":"c9b30623-5da9-40f3-8191-cdfb4690ab04","Type":"ContainerStarted","Data":"d46e295c5b1b431c1a7828959059c4c64d2055ba0c7cccd1cb3e872a556eb2a3"} Feb 17 20:18:41 crc kubenswrapper[4892]: I0217 20:18:41.410383 4892 generic.go:334] "Generic (PLEG): container finished" podID="c9b30623-5da9-40f3-8191-cdfb4690ab04" containerID="4da7ed4e3ec71e31417201a943b2b6ecde520158a32e7f2b614691f76097c06d" exitCode=0 Feb 17 20:18:41 crc kubenswrapper[4892]: I0217 20:18:41.410493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-z2zxw" event={"ID":"c9b30623-5da9-40f3-8191-cdfb4690ab04","Type":"ContainerDied","Data":"4da7ed4e3ec71e31417201a943b2b6ecde520158a32e7f2b614691f76097c06d"} Feb 17 20:18:41 crc kubenswrapper[4892]: I0217 20:18:41.585131 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znjnt/crc-debug-z2zxw"] Feb 17 20:18:41 crc kubenswrapper[4892]: I0217 20:18:41.595475 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znjnt/crc-debug-z2zxw"] Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.533059 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.542267 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9b30623-5da9-40f3-8191-cdfb4690ab04-host\") pod \"c9b30623-5da9-40f3-8191-cdfb4690ab04\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.542472 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7nm6\" (UniqueName: \"kubernetes.io/projected/c9b30623-5da9-40f3-8191-cdfb4690ab04-kube-api-access-h7nm6\") pod \"c9b30623-5da9-40f3-8191-cdfb4690ab04\" (UID: \"c9b30623-5da9-40f3-8191-cdfb4690ab04\") " Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.543676 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9b30623-5da9-40f3-8191-cdfb4690ab04-host" (OuterVolumeSpecName: "host") pod "c9b30623-5da9-40f3-8191-cdfb4690ab04" (UID: "c9b30623-5da9-40f3-8191-cdfb4690ab04"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.560008 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b30623-5da9-40f3-8191-cdfb4690ab04-kube-api-access-h7nm6" (OuterVolumeSpecName: "kube-api-access-h7nm6") pod "c9b30623-5da9-40f3-8191-cdfb4690ab04" (UID: "c9b30623-5da9-40f3-8191-cdfb4690ab04"). InnerVolumeSpecName "kube-api-access-h7nm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.650650 4892 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9b30623-5da9-40f3-8191-cdfb4690ab04-host\") on node \"crc\" DevicePath \"\"" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.650685 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7nm6\" (UniqueName: \"kubernetes.io/projected/c9b30623-5da9-40f3-8191-cdfb4690ab04-kube-api-access-h7nm6\") on node \"crc\" DevicePath \"\"" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.879626 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znjnt/crc-debug-lc4h8"] Feb 17 20:18:42 crc kubenswrapper[4892]: E0217 20:18:42.880304 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b30623-5da9-40f3-8191-cdfb4690ab04" containerName="container-00" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.880321 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b30623-5da9-40f3-8191-cdfb4690ab04" containerName="container-00" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.880569 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b30623-5da9-40f3-8191-cdfb4690ab04" containerName="container-00" Feb 17 20:18:42 crc kubenswrapper[4892]: I0217 20:18:42.881298 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.083104 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm789\" (UniqueName: \"kubernetes.io/projected/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-kube-api-access-lm789\") pod \"crc-debug-lc4h8\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") " pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.083296 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-host\") pod \"crc-debug-lc4h8\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") " pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.186358 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm789\" (UniqueName: \"kubernetes.io/projected/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-kube-api-access-lm789\") pod \"crc-debug-lc4h8\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") " pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.186473 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-host\") pod \"crc-debug-lc4h8\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") " pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.186654 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-host\") pod \"crc-debug-lc4h8\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") " pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc 
kubenswrapper[4892]: I0217 20:18:43.207255 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm789\" (UniqueName: \"kubernetes.io/projected/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-kube-api-access-lm789\") pod \"crc-debug-lc4h8\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") " pod="openshift-must-gather-znjnt/crc-debug-lc4h8" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.372211 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b30623-5da9-40f3-8191-cdfb4690ab04" path="/var/lib/kubelet/pods/c9b30623-5da9-40f3-8191-cdfb4690ab04/volumes" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.438474 4892 scope.go:117] "RemoveContainer" containerID="4da7ed4e3ec71e31417201a943b2b6ecde520158a32e7f2b614691f76097c06d" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.438663 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-z2zxw" Feb 17 20:18:43 crc kubenswrapper[4892]: I0217 20:18:43.502235 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-lc4h8"
Feb 17 20:18:43 crc kubenswrapper[4892]: W0217 20:18:43.569193 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59981b4c_5be9_46ea_89f3_bad67e4eeb6d.slice/crio-ed60aa6952fc98035401feef64e2a75f117051fd5d0832ae100db10d93a6139e WatchSource:0}: Error finding container ed60aa6952fc98035401feef64e2a75f117051fd5d0832ae100db10d93a6139e: Status 404 returned error can't find the container with id ed60aa6952fc98035401feef64e2a75f117051fd5d0832ae100db10d93a6139e
Feb 17 20:18:44 crc kubenswrapper[4892]: I0217 20:18:44.450064 4892 generic.go:334] "Generic (PLEG): container finished" podID="59981b4c-5be9-46ea-89f3-bad67e4eeb6d" containerID="8ee071e16241d90657f61fc830b641fdc8680f6a90c7cefe81d41e03332f2780" exitCode=0
Feb 17 20:18:44 crc kubenswrapper[4892]: I0217 20:18:44.450163 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-lc4h8" event={"ID":"59981b4c-5be9-46ea-89f3-bad67e4eeb6d","Type":"ContainerDied","Data":"8ee071e16241d90657f61fc830b641fdc8680f6a90c7cefe81d41e03332f2780"}
Feb 17 20:18:44 crc kubenswrapper[4892]: I0217 20:18:44.450435 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/crc-debug-lc4h8" event={"ID":"59981b4c-5be9-46ea-89f3-bad67e4eeb6d","Type":"ContainerStarted","Data":"ed60aa6952fc98035401feef64e2a75f117051fd5d0832ae100db10d93a6139e"}
Feb 17 20:18:44 crc kubenswrapper[4892]: I0217 20:18:44.489328 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znjnt/crc-debug-lc4h8"]
Feb 17 20:18:44 crc kubenswrapper[4892]: I0217 20:18:44.506512 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znjnt/crc-debug-lc4h8"]
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.359659 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:18:45 crc kubenswrapper[4892]: E0217 20:18:45.360173 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.592325 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-lc4h8"
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.760138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-host\") pod \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") "
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.760263 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-host" (OuterVolumeSpecName: "host") pod "59981b4c-5be9-46ea-89f3-bad67e4eeb6d" (UID: "59981b4c-5be9-46ea-89f3-bad67e4eeb6d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.760509 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm789\" (UniqueName: \"kubernetes.io/projected/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-kube-api-access-lm789\") pod \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\" (UID: \"59981b4c-5be9-46ea-89f3-bad67e4eeb6d\") "
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.761199 4892 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-host\") on node \"crc\" DevicePath \"\""
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.777845 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-kube-api-access-lm789" (OuterVolumeSpecName: "kube-api-access-lm789") pod "59981b4c-5be9-46ea-89f3-bad67e4eeb6d" (UID: "59981b4c-5be9-46ea-89f3-bad67e4eeb6d"). InnerVolumeSpecName "kube-api-access-lm789". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 20:18:45 crc kubenswrapper[4892]: I0217 20:18:45.863756 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm789\" (UniqueName: \"kubernetes.io/projected/59981b4c-5be9-46ea-89f3-bad67e4eeb6d-kube-api-access-lm789\") on node \"crc\" DevicePath \"\""
Feb 17 20:18:46 crc kubenswrapper[4892]: I0217 20:18:46.490790 4892 scope.go:117] "RemoveContainer" containerID="8ee071e16241d90657f61fc830b641fdc8680f6a90c7cefe81d41e03332f2780"
Feb 17 20:18:46 crc kubenswrapper[4892]: I0217 20:18:46.490837 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/crc-debug-lc4h8"
Feb 17 20:18:47 crc kubenswrapper[4892]: I0217 20:18:47.373125 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59981b4c-5be9-46ea-89f3-bad67e4eeb6d" path="/var/lib/kubelet/pods/59981b4c-5be9-46ea-89f3-bad67e4eeb6d/volumes"
Feb 17 20:18:59 crc kubenswrapper[4892]: I0217 20:18:59.360119 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:18:59 crc kubenswrapper[4892]: E0217 20:18:59.368196 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:19:10 crc kubenswrapper[4892]: I0217 20:19:10.359571 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:19:10 crc kubenswrapper[4892]: E0217 20:19:10.360628 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:19:25 crc kubenswrapper[4892]: I0217 20:19:25.359723 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:19:25 crc kubenswrapper[4892]: E0217 20:19:25.360553 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.213103 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2r49w"]
Feb 17 20:19:39 crc kubenswrapper[4892]: E0217 20:19:39.214115 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59981b4c-5be9-46ea-89f3-bad67e4eeb6d" containerName="container-00"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.214130 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="59981b4c-5be9-46ea-89f3-bad67e4eeb6d" containerName="container-00"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.214465 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="59981b4c-5be9-46ea-89f3-bad67e4eeb6d" containerName="container-00"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.216412 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.225421 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2r49w"]
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.289469 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-utilities\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.289539 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-catalog-content\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.289788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhmdn\" (UniqueName: \"kubernetes.io/projected/1797e1bb-d01d-47a8-9ada-0da1b6cec865-kube-api-access-dhmdn\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.392591 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-utilities\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.393021 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-catalog-content\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.393141 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhmdn\" (UniqueName: \"kubernetes.io/projected/1797e1bb-d01d-47a8-9ada-0da1b6cec865-kube-api-access-dhmdn\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.393446 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-utilities\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.394181 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-catalog-content\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.424193 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhmdn\" (UniqueName: \"kubernetes.io/projected/1797e1bb-d01d-47a8-9ada-0da1b6cec865-kube-api-access-dhmdn\") pod \"certified-operators-2r49w\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") " pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:39 crc kubenswrapper[4892]: I0217 20:19:39.545747 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:40 crc kubenswrapper[4892]: I0217 20:19:40.078311 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2r49w"]
Feb 17 20:19:40 crc kubenswrapper[4892]: I0217 20:19:40.146317 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerStarted","Data":"fd65d88997a5e510ac9ec9d60d706089de1c18ef3ca9b2eec491da79a5a79f49"}
Feb 17 20:19:40 crc kubenswrapper[4892]: I0217 20:19:40.359440 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:19:40 crc kubenswrapper[4892]: E0217 20:19:40.360774 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:19:41 crc kubenswrapper[4892]: I0217 20:19:41.161891 4892 generic.go:334] "Generic (PLEG): container finished" podID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerID="f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41" exitCode=0
Feb 17 20:19:41 crc kubenswrapper[4892]: I0217 20:19:41.161931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerDied","Data":"f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41"}
Feb 17 20:19:41 crc kubenswrapper[4892]: I0217 20:19:41.164504 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 20:19:42 crc kubenswrapper[4892]: I0217 20:19:42.174180 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerStarted","Data":"57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7"}
Feb 17 20:19:43 crc kubenswrapper[4892]: I0217 20:19:43.189384 4892 generic.go:334] "Generic (PLEG): container finished" podID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerID="57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7" exitCode=0
Feb 17 20:19:43 crc kubenswrapper[4892]: I0217 20:19:43.190052 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerDied","Data":"57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7"}
Feb 17 20:19:44 crc kubenswrapper[4892]: I0217 20:19:44.220173 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerStarted","Data":"7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c"}
Feb 17 20:19:44 crc kubenswrapper[4892]: I0217 20:19:44.253143 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2r49w" podStartSLOduration=2.801234818 podStartE2EDuration="5.253122161s" podCreationTimestamp="2026-02-17 20:19:39 +0000 UTC" firstStartedPulling="2026-02-17 20:19:41.164264949 +0000 UTC m=+9352.539668214" lastFinishedPulling="2026-02-17 20:19:43.616152292 +0000 UTC m=+9354.991555557" observedRunningTime="2026-02-17 20:19:44.241139949 +0000 UTC m=+9355.616543214" watchObservedRunningTime="2026-02-17 20:19:44.253122161 +0000 UTC m=+9355.628525426"
Feb 17 20:19:49 crc kubenswrapper[4892]: I0217 20:19:49.547066 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:49 crc kubenswrapper[4892]: I0217 20:19:49.547659 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:49 crc kubenswrapper[4892]: I0217 20:19:49.613261 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:50 crc kubenswrapper[4892]: I0217 20:19:50.375563 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:50 crc kubenswrapper[4892]: I0217 20:19:50.434791 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2r49w"]
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.351904 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2r49w" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="registry-server" containerID="cri-o://7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c" gracePeriod=2
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.904705 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.933901 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-catalog-content\") pod \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") "
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.934055 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-utilities\") pod \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") "
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.934109 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhmdn\" (UniqueName: \"kubernetes.io/projected/1797e1bb-d01d-47a8-9ada-0da1b6cec865-kube-api-access-dhmdn\") pod \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\" (UID: \"1797e1bb-d01d-47a8-9ada-0da1b6cec865\") "
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.936484 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-utilities" (OuterVolumeSpecName: "utilities") pod "1797e1bb-d01d-47a8-9ada-0da1b6cec865" (UID: "1797e1bb-d01d-47a8-9ada-0da1b6cec865"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:19:52 crc kubenswrapper[4892]: I0217 20:19:52.960551 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1797e1bb-d01d-47a8-9ada-0da1b6cec865-kube-api-access-dhmdn" (OuterVolumeSpecName: "kube-api-access-dhmdn") pod "1797e1bb-d01d-47a8-9ada-0da1b6cec865" (UID: "1797e1bb-d01d-47a8-9ada-0da1b6cec865"). InnerVolumeSpecName "kube-api-access-dhmdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.037443 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.037479 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhmdn\" (UniqueName: \"kubernetes.io/projected/1797e1bb-d01d-47a8-9ada-0da1b6cec865-kube-api-access-dhmdn\") on node \"crc\" DevicePath \"\""
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.363431 4892 generic.go:334] "Generic (PLEG): container finished" podID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerID="7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c" exitCode=0
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.363518 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r49w"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.377620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerDied","Data":"7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c"}
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.377712 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r49w" event={"ID":"1797e1bb-d01d-47a8-9ada-0da1b6cec865","Type":"ContainerDied","Data":"fd65d88997a5e510ac9ec9d60d706089de1c18ef3ca9b2eec491da79a5a79f49"}
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.377755 4892 scope.go:117] "RemoveContainer" containerID="7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.405278 4892 scope.go:117] "RemoveContainer" containerID="57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.437104 4892 scope.go:117] "RemoveContainer" containerID="f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.490656 4892 scope.go:117] "RemoveContainer" containerID="7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c"
Feb 17 20:19:53 crc kubenswrapper[4892]: E0217 20:19:53.491208 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c\": container with ID starting with 7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c not found: ID does not exist" containerID="7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.491246 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c"} err="failed to get container status \"7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c\": rpc error: code = NotFound desc = could not find container \"7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c\": container with ID starting with 7fd766481c56eaaec355f3f73fadcfa0f9fe2b8f6b0b886d4400f68226b9067c not found: ID does not exist"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.491271 4892 scope.go:117] "RemoveContainer" containerID="57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7"
Feb 17 20:19:53 crc kubenswrapper[4892]: E0217 20:19:53.491502 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7\": container with ID starting with 57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7 not found: ID does not exist" containerID="57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.491532 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7"} err="failed to get container status \"57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7\": rpc error: code = NotFound desc = could not find container \"57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7\": container with ID starting with 57b5a2a2bcd59ea540191ed7c95f6e02efeed9f5b65f18cd32381fb5aad5ddd7 not found: ID does not exist"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.491550 4892 scope.go:117] "RemoveContainer" containerID="f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41"
Feb 17 20:19:53 crc kubenswrapper[4892]: E0217 20:19:53.491752 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41\": container with ID starting with f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41 not found: ID does not exist" containerID="f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41"
Feb 17 20:19:53 crc kubenswrapper[4892]: I0217 20:19:53.491785 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41"} err="failed to get container status \"f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41\": rpc error: code = NotFound desc = could not find container \"f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41\": container with ID starting with f9853d93a95376299739bdbd4e7a1c220457e651e5025b6fc614f0c7f42e9f41 not found: ID does not exist"
Feb 17 20:19:54 crc kubenswrapper[4892]: I0217 20:19:54.060791 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1797e1bb-d01d-47a8-9ada-0da1b6cec865" (UID: "1797e1bb-d01d-47a8-9ada-0da1b6cec865"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:19:54 crc kubenswrapper[4892]: I0217 20:19:54.061522 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1797e1bb-d01d-47a8-9ada-0da1b6cec865-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 20:19:54 crc kubenswrapper[4892]: I0217 20:19:54.328699 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2r49w"]
Feb 17 20:19:54 crc kubenswrapper[4892]: I0217 20:19:54.344211 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2r49w"]
Feb 17 20:19:54 crc kubenswrapper[4892]: I0217 20:19:54.360327 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:19:54 crc kubenswrapper[4892]: E0217 20:19:54.360674 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:19:55 crc kubenswrapper[4892]: I0217 20:19:55.377723 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" path="/var/lib/kubelet/pods/1797e1bb-d01d-47a8-9ada-0da1b6cec865/volumes"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.760309 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xts2l"]
Feb 17 20:20:02 crc kubenswrapper[4892]: E0217 20:20:02.761362 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="registry-server"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.761375 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="registry-server"
Feb 17 20:20:02 crc kubenswrapper[4892]: E0217 20:20:02.761393 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="extract-utilities"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.761399 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="extract-utilities"
Feb 17 20:20:02 crc kubenswrapper[4892]: E0217 20:20:02.761433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="extract-content"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.761441 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="extract-content"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.761630 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1797e1bb-d01d-47a8-9ada-0da1b6cec865" containerName="registry-server"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.763287 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.777071 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br7j8\" (UniqueName: \"kubernetes.io/projected/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-kube-api-access-br7j8\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.777129 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-catalog-content\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.777177 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-utilities\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.800031 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xts2l"]
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.879215 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br7j8\" (UniqueName: \"kubernetes.io/projected/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-kube-api-access-br7j8\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.879274 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-catalog-content\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.879331 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-utilities\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.879798 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-catalog-content\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.879931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-utilities\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:02 crc kubenswrapper[4892]: I0217 20:20:02.899491 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br7j8\" (UniqueName: \"kubernetes.io/projected/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-kube-api-access-br7j8\") pod \"redhat-marketplace-xts2l\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") " pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:03 crc kubenswrapper[4892]: I0217 20:20:03.094183 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:03 crc kubenswrapper[4892]: I0217 20:20:03.579164 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xts2l"]
Feb 17 20:20:04 crc kubenswrapper[4892]: I0217 20:20:04.523511 4892 generic.go:334] "Generic (PLEG): container finished" podID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerID="c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d" exitCode=0
Feb 17 20:20:04 crc kubenswrapper[4892]: I0217 20:20:04.523772 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerDied","Data":"c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d"}
Feb 17 20:20:04 crc kubenswrapper[4892]: I0217 20:20:04.523880 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerStarted","Data":"7fc2ca7925fe216981f7fc56004c036d378431e94428e0bde98cf4cb3b41d8b4"}
Feb 17 20:20:05 crc kubenswrapper[4892]: I0217 20:20:05.546213 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerStarted","Data":"8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0"}
Feb 17 20:20:06 crc kubenswrapper[4892]: I0217 20:20:06.360409 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c"
Feb 17 20:20:06 crc kubenswrapper[4892]: E0217 20:20:06.361208 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35"
Feb 17 20:20:06 crc kubenswrapper[4892]: I0217 20:20:06.575077 4892 generic.go:334] "Generic (PLEG): container finished" podID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerID="8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0" exitCode=0
Feb 17 20:20:06 crc kubenswrapper[4892]: I0217 20:20:06.575140 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerDied","Data":"8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0"}
Feb 17 20:20:07 crc kubenswrapper[4892]: I0217 20:20:07.593794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerStarted","Data":"871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6"}
Feb 17 20:20:07 crc kubenswrapper[4892]: I0217 20:20:07.618675 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xts2l" podStartSLOduration=3.062927604 podStartE2EDuration="5.618655507s" podCreationTimestamp="2026-02-17 20:20:02 +0000 UTC" firstStartedPulling="2026-02-17 20:20:04.526278288 +0000 UTC m=+9375.901681593" lastFinishedPulling="2026-02-17 20:20:07.082006241 +0000 UTC m=+9378.457409496" observedRunningTime="2026-02-17 20:20:07.616924621 +0000 UTC m=+9378.992327926" watchObservedRunningTime="2026-02-17 20:20:07.618655507 +0000 UTC m=+9378.994058772"
Feb 17 20:20:13 crc kubenswrapper[4892]: I0217 20:20:13.094742 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:13 crc kubenswrapper[4892]: I0217 20:20:13.095502 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:13 crc kubenswrapper[4892]: I0217 20:20:13.146520 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:13 crc kubenswrapper[4892]: I0217 20:20:13.788707 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:13 crc kubenswrapper[4892]: I0217 20:20:13.845462 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xts2l"]
Feb 17 20:20:15 crc kubenswrapper[4892]: I0217 20:20:15.757764 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xts2l" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="registry-server" containerID="cri-o://871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6" gracePeriod=2
Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.271889 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xts2l"
Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.392778 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-catalog-content\") pod \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") "
Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.393225 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br7j8\" (UniqueName: \"kubernetes.io/projected/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-kube-api-access-br7j8\") pod \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") "
Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.393559 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-utilities\") pod \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\" (UID: \"6319cdb5-2a33-4900-9f1b-9aec82d3b41b\") "
Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.395008 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-utilities" (OuterVolumeSpecName: "utilities") pod "6319cdb5-2a33-4900-9f1b-9aec82d3b41b" (UID: "6319cdb5-2a33-4900-9f1b-9aec82d3b41b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.399614 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-kube-api-access-br7j8" (OuterVolumeSpecName: "kube-api-access-br7j8") pod "6319cdb5-2a33-4900-9f1b-9aec82d3b41b" (UID: "6319cdb5-2a33-4900-9f1b-9aec82d3b41b"). InnerVolumeSpecName "kube-api-access-br7j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.439783 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6319cdb5-2a33-4900-9f1b-9aec82d3b41b" (UID: "6319cdb5-2a33-4900-9f1b-9aec82d3b41b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.496472 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.496806 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br7j8\" (UniqueName: \"kubernetes.io/projected/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-kube-api-access-br7j8\") on node \"crc\" DevicePath \"\"" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.497082 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6319cdb5-2a33-4900-9f1b-9aec82d3b41b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.781254 4892 generic.go:334] "Generic (PLEG): container finished" podID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerID="871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6" exitCode=0 Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.781348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerDied","Data":"871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6"} Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.781658 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-xts2l" event={"ID":"6319cdb5-2a33-4900-9f1b-9aec82d3b41b","Type":"ContainerDied","Data":"7fc2ca7925fe216981f7fc56004c036d378431e94428e0bde98cf4cb3b41d8b4"} Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.781690 4892 scope.go:117] "RemoveContainer" containerID="871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.781369 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xts2l" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.809063 4892 scope.go:117] "RemoveContainer" containerID="8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.840918 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xts2l"] Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.854895 4892 scope.go:117] "RemoveContainer" containerID="c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.859444 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xts2l"] Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.898920 4892 scope.go:117] "RemoveContainer" containerID="871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6" Feb 17 20:20:16 crc kubenswrapper[4892]: E0217 20:20:16.899318 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6\": container with ID starting with 871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6 not found: ID does not exist" containerID="871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.899357 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6"} err="failed to get container status \"871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6\": rpc error: code = NotFound desc = could not find container \"871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6\": container with ID starting with 871c8c5a13ef1b844174d4dc979b6cb806043c5cce5b6a96b86dc4bc8ea88ae6 not found: ID does not exist" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.899383 4892 scope.go:117] "RemoveContainer" containerID="8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0" Feb 17 20:20:16 crc kubenswrapper[4892]: E0217 20:20:16.899748 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0\": container with ID starting with 8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0 not found: ID does not exist" containerID="8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.899771 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0"} err="failed to get container status \"8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0\": rpc error: code = NotFound desc = could not find container \"8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0\": container with ID starting with 8d1185ae72d95a4fd3b76dcde8089c75a1c0aaa8490a8842c2f0638c5ef06fb0 not found: ID does not exist" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.899783 4892 scope.go:117] "RemoveContainer" containerID="c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d" Feb 17 20:20:16 crc kubenswrapper[4892]: E0217 
20:20:16.900116 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d\": container with ID starting with c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d not found: ID does not exist" containerID="c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d" Feb 17 20:20:16 crc kubenswrapper[4892]: I0217 20:20:16.900145 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d"} err="failed to get container status \"c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d\": rpc error: code = NotFound desc = could not find container \"c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d\": container with ID starting with c11fb5af8ff7290c8fa73e3e4a383203224c0dbb4b729d43ddedd3ff5ced8a3d not found: ID does not exist" Feb 17 20:20:17 crc kubenswrapper[4892]: I0217 20:20:17.363938 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:20:17 crc kubenswrapper[4892]: E0217 20:20:17.370664 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:20:17 crc kubenswrapper[4892]: I0217 20:20:17.397414 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" path="/var/lib/kubelet/pods/6319cdb5-2a33-4900-9f1b-9aec82d3b41b/volumes" Feb 17 20:20:29 crc kubenswrapper[4892]: I0217 20:20:29.379035 
4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:20:29 crc kubenswrapper[4892]: E0217 20:20:29.380148 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:20:40 crc kubenswrapper[4892]: I0217 20:20:40.359384 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:20:40 crc kubenswrapper[4892]: E0217 20:20:40.360188 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:20:54 crc kubenswrapper[4892]: I0217 20:20:54.359796 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:20:54 crc kubenswrapper[4892]: E0217 20:20:54.361070 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:21:08 crc kubenswrapper[4892]: I0217 
20:21:08.360386 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:21:08 crc kubenswrapper[4892]: E0217 20:21:08.361331 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:21:23 crc kubenswrapper[4892]: I0217 20:21:23.362171 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:21:23 crc kubenswrapper[4892]: E0217 20:21:23.364795 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:21:35 crc kubenswrapper[4892]: I0217 20:21:35.360593 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:21:35 crc kubenswrapper[4892]: E0217 20:21:35.361551 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:21:47 crc 
kubenswrapper[4892]: I0217 20:21:47.360672 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:21:47 crc kubenswrapper[4892]: E0217 20:21:47.361621 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:22:01 crc kubenswrapper[4892]: I0217 20:22:01.359557 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:22:01 crc kubenswrapper[4892]: E0217 20:22:01.360580 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:22:16 crc kubenswrapper[4892]: I0217 20:22:16.360413 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:22:16 crc kubenswrapper[4892]: E0217 20:22:16.361464 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 
17 20:22:30 crc kubenswrapper[4892]: I0217 20:22:30.359410 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:22:30 crc kubenswrapper[4892]: E0217 20:22:30.360043 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:22:43 crc kubenswrapper[4892]: I0217 20:22:43.360882 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:22:44 crc kubenswrapper[4892]: I0217 20:22:44.450182 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"d229474ca59a6d1790d5bfd995f8dcef27acdebd08978bf381531977775cb29f"} Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.499417 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rkkv2"] Feb 17 20:23:45 crc kubenswrapper[4892]: E0217 20:23:45.500485 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="extract-content" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.500500 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="extract-content" Feb 17 20:23:45 crc kubenswrapper[4892]: E0217 20:23:45.500521 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="registry-server" Feb 17 20:23:45 crc kubenswrapper[4892]: 
I0217 20:23:45.500528 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="registry-server" Feb 17 20:23:45 crc kubenswrapper[4892]: E0217 20:23:45.500572 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="extract-utilities" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.500580 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="extract-utilities" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.500868 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6319cdb5-2a33-4900-9f1b-9aec82d3b41b" containerName="registry-server" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.502576 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.515260 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rkkv2"] Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.579562 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrs4\" (UniqueName: \"kubernetes.io/projected/f96b1606-24c5-4f44-8edd-eae1338b4ae8-kube-api-access-clrs4\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.579622 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-catalog-content\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc 
kubenswrapper[4892]: I0217 20:23:45.580098 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-utilities\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.682519 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clrs4\" (UniqueName: \"kubernetes.io/projected/f96b1606-24c5-4f44-8edd-eae1338b4ae8-kube-api-access-clrs4\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.682604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-catalog-content\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.682780 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-utilities\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.683092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-catalog-content\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc 
kubenswrapper[4892]: I0217 20:23:45.683331 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-utilities\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.707237 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrs4\" (UniqueName: \"kubernetes.io/projected/f96b1606-24c5-4f44-8edd-eae1338b4ae8-kube-api-access-clrs4\") pod \"community-operators-rkkv2\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:45 crc kubenswrapper[4892]: I0217 20:23:45.827480 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:46 crc kubenswrapper[4892]: I0217 20:23:46.416956 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rkkv2"] Feb 17 20:23:47 crc kubenswrapper[4892]: I0217 20:23:47.257371 4892 generic.go:334] "Generic (PLEG): container finished" podID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerID="a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f" exitCode=0 Feb 17 20:23:47 crc kubenswrapper[4892]: I0217 20:23:47.257476 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerDied","Data":"a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f"} Feb 17 20:23:47 crc kubenswrapper[4892]: I0217 20:23:47.257659 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" 
event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerStarted","Data":"2c037baeb22a89685bf48062ac1a2dda57a3f5d869ecc4ae5862dab3cdd50731"} Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.269409 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerStarted","Data":"aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24"} Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.497794 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2lvtc"] Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.500342 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.510117 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lvtc"] Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.657162 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-utilities\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.657261 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-catalog-content\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.657488 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blh6d\" 
(UniqueName: \"kubernetes.io/projected/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-kube-api-access-blh6d\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.759208 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blh6d\" (UniqueName: \"kubernetes.io/projected/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-kube-api-access-blh6d\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.759350 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-utilities\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.759408 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-catalog-content\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.760080 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-catalog-content\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.760256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-utilities\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.788702 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blh6d\" (UniqueName: \"kubernetes.io/projected/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-kube-api-access-blh6d\") pod \"redhat-operators-2lvtc\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:48 crc kubenswrapper[4892]: I0217 20:23:48.827490 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:49 crc kubenswrapper[4892]: W0217 20:23:49.371734 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b7d4b72_4b0e_4eba_9692_b58a2c066a5a.slice/crio-ac0d005277eab97a50f07687bbebceddc380d8f5cad89a36c1c0140bbbf29734 WatchSource:0}: Error finding container ac0d005277eab97a50f07687bbebceddc380d8f5cad89a36c1c0140bbbf29734: Status 404 returned error can't find the container with id ac0d005277eab97a50f07687bbebceddc380d8f5cad89a36c1c0140bbbf29734 Feb 17 20:23:49 crc kubenswrapper[4892]: I0217 20:23:49.392236 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lvtc"] Feb 17 20:23:50 crc kubenswrapper[4892]: I0217 20:23:50.297118 4892 generic.go:334] "Generic (PLEG): container finished" podID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerID="dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44" exitCode=0 Feb 17 20:23:50 crc kubenswrapper[4892]: I0217 20:23:50.297259 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" 
event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerDied","Data":"dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44"} Feb 17 20:23:50 crc kubenswrapper[4892]: I0217 20:23:50.297674 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerStarted","Data":"ac0d005277eab97a50f07687bbebceddc380d8f5cad89a36c1c0140bbbf29734"} Feb 17 20:23:50 crc kubenswrapper[4892]: I0217 20:23:50.305157 4892 generic.go:334] "Generic (PLEG): container finished" podID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerID="aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24" exitCode=0 Feb 17 20:23:50 crc kubenswrapper[4892]: I0217 20:23:50.305220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerDied","Data":"aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24"} Feb 17 20:23:51 crc kubenswrapper[4892]: I0217 20:23:51.321199 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerStarted","Data":"bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62"} Feb 17 20:23:51 crc kubenswrapper[4892]: I0217 20:23:51.324943 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerStarted","Data":"b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee"} Feb 17 20:23:51 crc kubenswrapper[4892]: I0217 20:23:51.376068 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rkkv2" podStartSLOduration=2.797894115 podStartE2EDuration="6.376046028s" podCreationTimestamp="2026-02-17 20:23:45 +0000 
UTC" firstStartedPulling="2026-02-17 20:23:47.260654563 +0000 UTC m=+9598.636057828" lastFinishedPulling="2026-02-17 20:23:50.838806476 +0000 UTC m=+9602.214209741" observedRunningTime="2026-02-17 20:23:51.36719086 +0000 UTC m=+9602.742594135" watchObservedRunningTime="2026-02-17 20:23:51.376046028 +0000 UTC m=+9602.751449303" Feb 17 20:23:54 crc kubenswrapper[4892]: I0217 20:23:54.372068 4892 generic.go:334] "Generic (PLEG): container finished" podID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerID="bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62" exitCode=0 Feb 17 20:23:54 crc kubenswrapper[4892]: I0217 20:23:54.372178 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerDied","Data":"bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62"} Feb 17 20:23:55 crc kubenswrapper[4892]: I0217 20:23:55.387011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerStarted","Data":"48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e"} Feb 17 20:23:55 crc kubenswrapper[4892]: I0217 20:23:55.417309 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2lvtc" podStartSLOduration=2.904015175 podStartE2EDuration="7.417259704s" podCreationTimestamp="2026-02-17 20:23:48 +0000 UTC" firstStartedPulling="2026-02-17 20:23:50.298846332 +0000 UTC m=+9601.674249597" lastFinishedPulling="2026-02-17 20:23:54.812090831 +0000 UTC m=+9606.187494126" observedRunningTime="2026-02-17 20:23:55.408688373 +0000 UTC m=+9606.784091668" watchObservedRunningTime="2026-02-17 20:23:55.417259704 +0000 UTC m=+9606.792662969" Feb 17 20:23:55 crc kubenswrapper[4892]: I0217 20:23:55.828378 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:55 crc kubenswrapper[4892]: I0217 20:23:55.828727 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:55 crc kubenswrapper[4892]: I0217 20:23:55.903498 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:56 crc kubenswrapper[4892]: I0217 20:23:56.456381 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:23:58 crc kubenswrapper[4892]: I0217 20:23:58.828236 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:58 crc kubenswrapper[4892]: I0217 20:23:58.828547 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:23:59 crc kubenswrapper[4892]: I0217 20:23:59.886257 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2lvtc" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="registry-server" probeResult="failure" output=< Feb 17 20:23:59 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Feb 17 20:23:59 crc kubenswrapper[4892]: > Feb 17 20:24:00 crc kubenswrapper[4892]: I0217 20:24:00.693164 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rkkv2"] Feb 17 20:24:00 crc kubenswrapper[4892]: I0217 20:24:00.693662 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rkkv2" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="registry-server" containerID="cri-o://b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee" gracePeriod=2 Feb 17 20:24:01 crc kubenswrapper[4892]: 
I0217 20:24:01.202617 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.380452 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clrs4\" (UniqueName: \"kubernetes.io/projected/f96b1606-24c5-4f44-8edd-eae1338b4ae8-kube-api-access-clrs4\") pod \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.380666 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-utilities\") pod \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.380787 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-catalog-content\") pod \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\" (UID: \"f96b1606-24c5-4f44-8edd-eae1338b4ae8\") " Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.382065 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-utilities" (OuterVolumeSpecName: "utilities") pod "f96b1606-24c5-4f44-8edd-eae1338b4ae8" (UID: "f96b1606-24c5-4f44-8edd-eae1338b4ae8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.389989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96b1606-24c5-4f44-8edd-eae1338b4ae8-kube-api-access-clrs4" (OuterVolumeSpecName: "kube-api-access-clrs4") pod "f96b1606-24c5-4f44-8edd-eae1338b4ae8" (UID: "f96b1606-24c5-4f44-8edd-eae1338b4ae8"). InnerVolumeSpecName "kube-api-access-clrs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.461402 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f96b1606-24c5-4f44-8edd-eae1338b4ae8" (UID: "f96b1606-24c5-4f44-8edd-eae1338b4ae8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.467325 4892 generic.go:334] "Generic (PLEG): container finished" podID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerID="b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee" exitCode=0 Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.467561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerDied","Data":"b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee"} Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.467611 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkkv2" event={"ID":"f96b1606-24c5-4f44-8edd-eae1338b4ae8","Type":"ContainerDied","Data":"2c037baeb22a89685bf48062ac1a2dda57a3f5d869ecc4ae5862dab3cdd50731"} Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.467634 4892 scope.go:117] "RemoveContainer" 
containerID="b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.467634 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkkv2" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.484806 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.484906 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clrs4\" (UniqueName: \"kubernetes.io/projected/f96b1606-24c5-4f44-8edd-eae1338b4ae8-kube-api-access-clrs4\") on node \"crc\" DevicePath \"\"" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.484929 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f96b1606-24c5-4f44-8edd-eae1338b4ae8-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.507307 4892 scope.go:117] "RemoveContainer" containerID="aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.526935 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rkkv2"] Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.541462 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rkkv2"] Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.558870 4892 scope.go:117] "RemoveContainer" containerID="a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.593854 4892 scope.go:117] "RemoveContainer" containerID="b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee" Feb 17 20:24:01 crc 
kubenswrapper[4892]: E0217 20:24:01.594370 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee\": container with ID starting with b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee not found: ID does not exist" containerID="b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.594408 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee"} err="failed to get container status \"b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee\": rpc error: code = NotFound desc = could not find container \"b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee\": container with ID starting with b9d965ec72015b1e785e1dc40b6419edec8f5516111372d69707f3f16c07b5ee not found: ID does not exist" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.594429 4892 scope.go:117] "RemoveContainer" containerID="aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24" Feb 17 20:24:01 crc kubenswrapper[4892]: E0217 20:24:01.594877 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24\": container with ID starting with aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24 not found: ID does not exist" containerID="aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.594906 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24"} err="failed to get container status 
\"aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24\": rpc error: code = NotFound desc = could not find container \"aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24\": container with ID starting with aa61ce829d17200d6f61848d4c06d9ba2b1c4e3acff1bd7b5570c649a0ff8b24 not found: ID does not exist" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.594925 4892 scope.go:117] "RemoveContainer" containerID="a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f" Feb 17 20:24:01 crc kubenswrapper[4892]: E0217 20:24:01.595184 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f\": container with ID starting with a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f not found: ID does not exist" containerID="a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f" Feb 17 20:24:01 crc kubenswrapper[4892]: I0217 20:24:01.595212 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f"} err="failed to get container status \"a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f\": rpc error: code = NotFound desc = could not find container \"a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f\": container with ID starting with a49d2a3dca566a8253a849c28d18d657738c5e63c575819fcb46b2af191a135f not found: ID does not exist" Feb 17 20:24:03 crc kubenswrapper[4892]: I0217 20:24:03.387043 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" path="/var/lib/kubelet/pods/f96b1606-24c5-4f44-8edd-eae1338b4ae8/volumes" Feb 17 20:24:08 crc kubenswrapper[4892]: I0217 20:24:08.902460 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:24:08 crc kubenswrapper[4892]: I0217 20:24:08.974092 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:24:09 crc kubenswrapper[4892]: I0217 20:24:09.298053 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lvtc"] Feb 17 20:24:10 crc kubenswrapper[4892]: I0217 20:24:10.586587 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2lvtc" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="registry-server" containerID="cri-o://48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e" gracePeriod=2 Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.090361 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.209178 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-catalog-content\") pod \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.209637 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-utilities\") pod \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\" (UID: \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.209664 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blh6d\" (UniqueName: \"kubernetes.io/projected/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-kube-api-access-blh6d\") pod \"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\" (UID: 
\"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a\") " Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.210374 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-utilities" (OuterVolumeSpecName: "utilities") pod "2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" (UID: "2b7d4b72-4b0e-4eba-9692-b58a2c066a5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.217176 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-kube-api-access-blh6d" (OuterVolumeSpecName: "kube-api-access-blh6d") pod "2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" (UID: "2b7d4b72-4b0e-4eba-9692-b58a2c066a5a"). InnerVolumeSpecName "kube-api-access-blh6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.312614 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.312653 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blh6d\" (UniqueName: \"kubernetes.io/projected/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-kube-api-access-blh6d\") on node \"crc\" DevicePath \"\"" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.344792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" (UID: "2b7d4b72-4b0e-4eba-9692-b58a2c066a5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.414300 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.599005 4892 generic.go:334] "Generic (PLEG): container finished" podID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerID="48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e" exitCode=0 Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.599080 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lvtc" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.599087 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerDied","Data":"48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e"} Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.599170 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lvtc" event={"ID":"2b7d4b72-4b0e-4eba-9692-b58a2c066a5a","Type":"ContainerDied","Data":"ac0d005277eab97a50f07687bbebceddc380d8f5cad89a36c1c0140bbbf29734"} Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.599201 4892 scope.go:117] "RemoveContainer" containerID="48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.623443 4892 scope.go:117] "RemoveContainer" containerID="bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.638725 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lvtc"] Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 
20:24:11.646462 4892 scope.go:117] "RemoveContainer" containerID="dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.658418 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2lvtc"] Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.707424 4892 scope.go:117] "RemoveContainer" containerID="48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e" Feb 17 20:24:11 crc kubenswrapper[4892]: E0217 20:24:11.707941 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e\": container with ID starting with 48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e not found: ID does not exist" containerID="48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.707979 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e"} err="failed to get container status \"48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e\": rpc error: code = NotFound desc = could not find container \"48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e\": container with ID starting with 48d422fc262617deab7a0181fb860ab22aaba76b0978254522651c3ebca6929e not found: ID does not exist" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.708003 4892 scope.go:117] "RemoveContainer" containerID="bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62" Feb 17 20:24:11 crc kubenswrapper[4892]: E0217 20:24:11.708226 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62\": container with ID 
starting with bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62 not found: ID does not exist" containerID="bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.708249 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62"} err="failed to get container status \"bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62\": rpc error: code = NotFound desc = could not find container \"bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62\": container with ID starting with bd03f6e5ea9c098ed672c4d2c899b5a1b59b25f4a9a581f88b101a1cdc5a4b62 not found: ID does not exist" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.708266 4892 scope.go:117] "RemoveContainer" containerID="dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44" Feb 17 20:24:11 crc kubenswrapper[4892]: E0217 20:24:11.708539 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44\": container with ID starting with dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44 not found: ID does not exist" containerID="dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44" Feb 17 20:24:11 crc kubenswrapper[4892]: I0217 20:24:11.708658 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44"} err="failed to get container status \"dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44\": rpc error: code = NotFound desc = could not find container \"dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44\": container with ID starting with dc50f086a214600f88c4905411cd792f4fd541c96364cc0c80dea92bbcc77a44 not found: 
ID does not exist" Feb 17 20:24:13 crc kubenswrapper[4892]: I0217 20:24:13.373080 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" path="/var/lib/kubelet/pods/2b7d4b72-4b0e-4eba-9692-b58a2c066a5a/volumes" Feb 17 20:25:07 crc kubenswrapper[4892]: I0217 20:25:07.424669 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:25:07 crc kubenswrapper[4892]: I0217 20:25:07.425352 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:25:37 crc kubenswrapper[4892]: I0217 20:25:37.425117 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:25:37 crc kubenswrapper[4892]: I0217 20:25:37.425601 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:26:07 crc kubenswrapper[4892]: I0217 20:26:07.425107 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:26:07 crc kubenswrapper[4892]: I0217 20:26:07.426630 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:26:07 crc kubenswrapper[4892]: I0217 20:26:07.426740 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 20:26:07 crc kubenswrapper[4892]: I0217 20:26:07.427563 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d229474ca59a6d1790d5bfd995f8dcef27acdebd08978bf381531977775cb29f"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 20:26:07 crc kubenswrapper[4892]: I0217 20:26:07.427734 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://d229474ca59a6d1790d5bfd995f8dcef27acdebd08978bf381531977775cb29f" gracePeriod=600 Feb 17 20:26:08 crc kubenswrapper[4892]: I0217 20:26:08.033692 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="d229474ca59a6d1790d5bfd995f8dcef27acdebd08978bf381531977775cb29f" exitCode=0 Feb 17 20:26:08 crc kubenswrapper[4892]: I0217 20:26:08.033746 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"d229474ca59a6d1790d5bfd995f8dcef27acdebd08978bf381531977775cb29f"} Feb 17 20:26:08 crc kubenswrapper[4892]: I0217 20:26:08.034097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30"} Feb 17 20:26:08 crc kubenswrapper[4892]: I0217 20:26:08.034119 4892 scope.go:117] "RemoveContainer" containerID="0ab03386b956c72a478daacc623c9e719cc9be4c2ae5015b826cee6ca5f2b27c" Feb 17 20:26:35 crc kubenswrapper[4892]: I0217 20:26:35.357649 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b1b1be0e-8ec2-48ea-967b-a89c7e20bea9/init-config-reloader/0.log" Feb 17 20:26:35 crc kubenswrapper[4892]: I0217 20:26:35.650389 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b1b1be0e-8ec2-48ea-967b-a89c7e20bea9/alertmanager/0.log" Feb 17 20:26:35 crc kubenswrapper[4892]: I0217 20:26:35.656007 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b1b1be0e-8ec2-48ea-967b-a89c7e20bea9/init-config-reloader/0.log" Feb 17 20:26:35 crc kubenswrapper[4892]: I0217 20:26:35.656199 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b1b1be0e-8ec2-48ea-967b-a89c7e20bea9/config-reloader/0.log" Feb 17 20:26:35 crc kubenswrapper[4892]: I0217 20:26:35.844093 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d4722499-5387-425c-a006-1664b733c70e/aodh-api/0.log" Feb 17 20:26:35 crc kubenswrapper[4892]: I0217 20:26:35.913003 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d4722499-5387-425c-a006-1664b733c70e/aodh-listener/0.log" Feb 17 20:26:35 crc 
kubenswrapper[4892]: I0217 20:26:35.917307 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d4722499-5387-425c-a006-1664b733c70e/aodh-evaluator/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.022196 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d4722499-5387-425c-a006-1664b733c70e/aodh-notifier/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.077762 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-694cc5859d-jlpkr_8380e84c-8f80-43dc-825e-d9dd3dd0533f/barbican-api/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.129116 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-694cc5859d-jlpkr_8380e84c-8f80-43dc-825e-d9dd3dd0533f/barbican-api-log/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.310505 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68d8fcd9d6-kjplh_de7a0634-e601-4c65-adb6-8e2625e1709b/barbican-keystone-listener/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.364543 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68d8fcd9d6-kjplh_de7a0634-e601-4c65-adb6-8e2625e1709b/barbican-keystone-listener-log/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.511636 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b97c57775-zc698_fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15/barbican-worker-log/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.574098 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b97c57775-zc698_fa15978b-f3ec-4d7f-8a4e-4a4bcfd61f15/barbican-worker/0.log" Feb 17 20:26:36 crc kubenswrapper[4892]: I0217 20:26:36.691228 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-v99wc_69c7c84a-660b-4db3-a143-95efe4b92db2/bootstrap-openstack-openstack-cell1/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.059489 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f/ceilometer-central-agent/1.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.071067 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f/ceilometer-notification-agent/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.116941 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f/ceilometer-central-agent/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.172523 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f/proxy-httpd/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.252810 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ef7f0a40-a8bc-4a01-be30-3f0d5e5fd52f/sg-core/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.358730 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-f8r4q_07d0e22b-fd41-43a3-8d50-fc6a280e0637/ceph-client-openstack-openstack-cell1/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.491485 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b762d850-e72b-4151-97c1-c6e1f8c9e76f/cinder-api/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.550268 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b762d850-e72b-4151-97c1-c6e1f8c9e76f/cinder-api-log/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.751535 4892 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_cinder-backup-0_0c0d1a90-2e96-43e4-9ed7-b375dd729dd5/probe/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.820856 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0c0d1a90-2e96-43e4-9ed7-b375dd729dd5/cinder-backup/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.910064 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b4368bec-a527-4e2e-bcd8-c3b83faf9bca/cinder-scheduler/0.log" Feb 17 20:26:37 crc kubenswrapper[4892]: I0217 20:26:37.981877 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b4368bec-a527-4e2e-bcd8-c3b83faf9bca/probe/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.107752 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_196db019-189a-4787-a766-f9ae8d46cbea/cinder-volume/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.215910 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_196db019-189a-4787-a766-f9ae8d46cbea/probe/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.233583 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-lvk64_9d911062-0c28-415c-bdb4-6288348c105a/configure-network-openstack-openstack-cell1/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.312797 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-xqknj_a33740f5-3f9c-4f60-a0b9-cc91fbed6701/configure-os-openstack-openstack-cell1/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.405015 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d54db68c9-hsn6k_902b8a22-0008-44c7-a2c5-a1cfebd97794/init/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.592946 4892 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d54db68c9-hsn6k_902b8a22-0008-44c7-a2c5-a1cfebd97794/init/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.625442 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d54db68c9-hsn6k_902b8a22-0008-44c7-a2c5-a1cfebd97794/dnsmasq-dns/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.691806 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-hmvph_0b80ae5d-715f-4c7b-a06b-597a0a53a869/download-cache-openstack-openstack-cell1/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.806193 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_253a67a0-8806-43b4-8994-10b9143ee4dd/glance-httpd/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.844130 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_253a67a0-8806-43b4-8994-10b9143ee4dd/glance-log/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.929317 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e6e79a3c-a8e5-4144-bedb-0e771ee43025/glance-httpd/0.log" Feb 17 20:26:38 crc kubenswrapper[4892]: I0217 20:26:38.974545 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e6e79a3c-a8e5-4144-bedb-0e771ee43025/glance-log/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.145453 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7bf8574b5-6xj5n_92b20a09-f679-471f-aad0-bf6e308b3bce/heat-api/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.244670 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5fcbcbc588-hh84w_0fc379d3-42fb-4714-b739-78a9f7e81068/heat-cfnapi/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: 
I0217 20:26:39.331660 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-768dfc78cc-qbdts_2d5b0472-6954-47d0-b1dd-aeefea0ce5be/heat-engine/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.605442 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-klzmt_cdb317ef-c3b7-41a8-b80b-de001a95dbd1/install-certs-openstack-openstack-cell1/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.607159 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dcbd57fbf-8j2s5_fe6fd6f5-5715-4a79-8c5b-839baf178e1d/horizon-log/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.616262 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dcbd57fbf-8j2s5_fe6fd6f5-5715-4a79-8c5b-839baf178e1d/horizon/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.838793 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-mt4f2_fab94950-73bd-4831-8872-84219a776cef/install-os-openstack-openstack-cell1/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.893805 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7d9c8c664b-fqjld_92b3e18a-a257-410d-9f92-d6775960d070/keystone-api/0.log" Feb 17 20:26:39 crc kubenswrapper[4892]: I0217 20:26:39.976384 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29522641-mcslv_86917ae5-f69c-4bcf-a090-99818cf3c471/keystone-cron/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.138019 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-bfmls_0f868edd-d082-4f76-87db-34a40d03ba30/libvirt-openstack-openstack-cell1/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.139024 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_7e8e76a7-b0ec-4b83-b288-c80fdd74ff97/kube-state-metrics/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.396094 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_37a2faa2-4ddf-4fd0-b254-1e5b139e29bb/manila-api/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.417448 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_37a2faa2-4ddf-4fd0-b254-1e5b139e29bb/manila-api-log/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.464907 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_13c41fce-5694-4cf4-af3d-befaa59b6459/manila-scheduler/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.486902 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_13c41fce-5694-4cf4-af3d-befaa59b6459/probe/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.611575 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_806e5569-2407-4607-8331-cc09f54e37a6/manila-share/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.700367 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_806e5569-2407-4607-8331-cc09f54e37a6/probe/0.log" Feb 17 20:26:40 crc kubenswrapper[4892]: I0217 20:26:40.727607 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_4e3f7981-bf3f-4acc-8345-e58e505778b4/adoption/0.log" Feb 17 20:26:41 crc kubenswrapper[4892]: I0217 20:26:41.286781 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cd6dd448c-cv5w9_63dc1c5d-2307-487a-a2f3-5c40864bdfb9/neutron-httpd/0.log" Feb 17 20:26:41 crc kubenswrapper[4892]: I0217 20:26:41.353883 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-cd6dd448c-cv5w9_63dc1c5d-2307-487a-a2f3-5c40864bdfb9/neutron-api/0.log" Feb 17 20:26:41 crc kubenswrapper[4892]: I0217 20:26:41.450365 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-qcxh2_e6cbf46e-f128-468f-99f5-64ae24e21ec6/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 17 20:26:41 crc kubenswrapper[4892]: I0217 20:26:41.539361 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-c8mfc_4e432a42-0176-49de-915c-f655506500e6/neutron-metadata-openstack-openstack-cell1/0.log" Feb 17 20:26:41 crc kubenswrapper[4892]: I0217 20:26:41.731404 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-v5r58_0dcceec2-d889-492f-921f-0da0329466c4/neutron-sriov-openstack-openstack-cell1/0.log" Feb 17 20:26:41 crc kubenswrapper[4892]: I0217 20:26:41.887133 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_872897e3-475e-40c0-b7f3-ae6a8c6efd29/nova-api-api/0.log" Feb 17 20:26:42 crc kubenswrapper[4892]: I0217 20:26:42.064174 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_872897e3-475e-40c0-b7f3-ae6a8c6efd29/nova-api-log/0.log" Feb 17 20:26:42 crc kubenswrapper[4892]: I0217 20:26:42.258609 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5bbf722a-acfd-4b39-b080-b9022968adac/nova-cell0-conductor-conductor/0.log" Feb 17 20:26:42 crc kubenswrapper[4892]: I0217 20:26:42.391176 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_313bae35-ce99-4aed-915a-d8e1c5d8202c/nova-cell1-conductor-conductor/0.log" Feb 17 20:26:42 crc kubenswrapper[4892]: I0217 20:26:42.551623 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8f2c6c93-db35-404a-a613-34cb4b2de98b/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 20:26:43 crc kubenswrapper[4892]: I0217 20:26:43.356466 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellst9zt_e13a74be-1b5c-4c8f-9c61-5aa0965a4610/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 17 20:26:43 crc kubenswrapper[4892]: I0217 20:26:43.364677 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-8lwbt_7a1ac411-8b1b-4947-8e72-7d4401056d3f/nova-cell1-openstack-openstack-cell1/0.log" Feb 17 20:26:43 crc kubenswrapper[4892]: I0217 20:26:43.666724 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e2cefca-37d0-4868-97cf-bf6b1a24f0a3/nova-metadata-metadata/0.log" Feb 17 20:26:43 crc kubenswrapper[4892]: I0217 20:26:43.677465 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e2cefca-37d0-4868-97cf-bf6b1a24f0a3/nova-metadata-log/0.log" Feb 17 20:26:43 crc kubenswrapper[4892]: I0217 20:26:43.804935 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_88b16467-84db-4897-aaa7-6b523f95112f/nova-scheduler-scheduler/0.log" Feb 17 20:26:43 crc kubenswrapper[4892]: I0217 20:26:43.928327 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bd9b9bf5f-c7bmv_6e8dd5b6-76f7-4712-a660-b7a23721b2ad/init/0.log" Feb 17 20:26:44 crc kubenswrapper[4892]: I0217 20:26:44.067447 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bd9b9bf5f-c7bmv_6e8dd5b6-76f7-4712-a660-b7a23721b2ad/init/0.log" Feb 17 20:26:44 crc kubenswrapper[4892]: I0217 20:26:44.160210 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-bd9b9bf5f-c7bmv_6e8dd5b6-76f7-4712-a660-b7a23721b2ad/octavia-api-provider-agent/0.log" Feb 17 20:26:44 crc kubenswrapper[4892]: I0217 20:26:44.417354 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bd9b9bf5f-c7bmv_6e8dd5b6-76f7-4712-a660-b7a23721b2ad/octavia-api/0.log" Feb 17 20:26:44 crc kubenswrapper[4892]: I0217 20:26:44.751781 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-crjvb_154030e3-3d0f-4c51-8a6e-e9ddd9238c03/init/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.220290 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-crjvb_154030e3-3d0f-4c51-8a6e-e9ddd9238c03/init/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.251030 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-bk2pv_3c603be1-c02e-4c29-ae4c-f0209436fa4b/init/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.307198 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-crjvb_154030e3-3d0f-4c51-8a6e-e9ddd9238c03/octavia-healthmanager/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.535735 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-bk2pv_3c603be1-c02e-4c29-ae4c-f0209436fa4b/init/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.592996 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-tgk79_a4e42272-b37b-4214-9a11-de8ce611d1b3/init/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.600931 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-bk2pv_3c603be1-c02e-4c29-ae4c-f0209436fa4b/octavia-housekeeping/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.808740 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-tgk79_a4e42272-b37b-4214-9a11-de8ce611d1b3/init/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.863777 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-tgk79_a4e42272-b37b-4214-9a11-de8ce611d1b3/octavia-amphora-httpd/0.log" Feb 17 20:26:45 crc kubenswrapper[4892]: I0217 20:26:45.917770 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c6t7x_2fcd17ca-4457-418b-8dc1-3f8b5e485d72/init/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.176116 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c6t7x_2fcd17ca-4457-418b-8dc1-3f8b5e485d72/octavia-rsyslog/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.177473 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c6t7x_2fcd17ca-4457-418b-8dc1-3f8b5e485d72/init/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.243743 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pck59_cda6c9c6-98f3-4399-b095-ce45683ebd27/init/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.394361 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pck59_cda6c9c6-98f3-4399-b095-ce45683ebd27/init/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.575552 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pck59_cda6c9c6-98f3-4399-b095-ce45683ebd27/octavia-worker/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.581859 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_20cf068b-8714-4fc3-8a41-3af0baacc634/mysql-bootstrap/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.713415 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_20cf068b-8714-4fc3-8a41-3af0baacc634/mysql-bootstrap/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.835710 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_20cf068b-8714-4fc3-8a41-3af0baacc634/galera/0.log" Feb 17 20:26:46 crc kubenswrapper[4892]: I0217 20:26:46.847691 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c466a3b3-748c-4402-a029-ba4f30d2f660/mysql-bootstrap/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.048173 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c466a3b3-748c-4402-a029-ba4f30d2f660/mysql-bootstrap/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.081906 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c466a3b3-748c-4402-a029-ba4f30d2f660/galera/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.154857 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3519261b-9df6-4bbd-976b-a6987e030742/openstackclient/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.393739 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9w7kn_031573a9-543b-4333-a50c-d7e514eb3a41/ovn-controller/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.439378 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vjdzt_ba17e82c-3558-4262-99b7-d37b403e49dd/openstack-network-exporter/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.579643 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pc6zw_8bec279d-5310-4762-8f15-b0ad2d919df9/ovsdb-server-init/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.811078 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-pc6zw_8bec279d-5310-4762-8f15-b0ad2d919df9/ovsdb-server/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.823414 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pc6zw_8bec279d-5310-4762-8f15-b0ad2d919df9/ovs-vswitchd/0.log" Feb 17 20:26:47 crc kubenswrapper[4892]: I0217 20:26:47.842525 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pc6zw_8bec279d-5310-4762-8f15-b0ad2d919df9/ovsdb-server-init/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.057271 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2/openstack-network-exporter/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.058289 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_d6e36b08-e650-4dec-82ee-72c4e5e013d7/adoption/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.197607 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cb3a7c2a-a1ba-4b09-b07c-6287365b6ee2/ovn-northd/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.282294 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-wt46w_d107e682-a9c6-463f-ac66-142f2fe6d6c2/ovn-openstack-openstack-cell1/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.451945 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8f5f5244-28cd-4312-84a9-7baa4b22d017/openstack-network-exporter/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.468132 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8f5f5244-28cd-4312-84a9-7baa4b22d017/ovsdbserver-nb/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.662664 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_be323205-1316-4c35-9475-8d0e5a64c889/openstack-network-exporter/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.671794 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_be323205-1316-4c35-9475-8d0e5a64c889/ovsdbserver-nb/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.867250 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d988b11e-1dda-417d-8792-4130cb690552/openstack-network-exporter/0.log" Feb 17 20:26:48 crc kubenswrapper[4892]: I0217 20:26:48.933170 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d988b11e-1dda-417d-8792-4130cb690552/ovsdbserver-nb/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.201914 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_05c8f198-b5d3-4126-a3ed-89c2fd6cca1f/openstack-network-exporter/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.274553 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_05c8f198-b5d3-4126-a3ed-89c2fd6cca1f/ovsdbserver-sb/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.377975 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_5457fa69-1de2-46fe-bd62-0af0c4bc3bd7/openstack-network-exporter/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.455919 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_5457fa69-1de2-46fe-bd62-0af0c4bc3bd7/ovsdbserver-sb/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.528959 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_0d7585d4-873b-4de6-97f9-4e9f719a0805/openstack-network-exporter/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.579158 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-2_0d7585d4-873b-4de6-97f9-4e9f719a0805/ovsdbserver-sb/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.771789 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-89979d8dd-729tc_e155c783-830f-44c1-85da-037c34052461/placement-api/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.871768 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-89979d8dd-729tc_e155c783-830f-44c1-85da-037c34052461/placement-log/0.log" Feb 17 20:26:49 crc kubenswrapper[4892]: I0217 20:26:49.941212 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cg76fw_422e1c2e-e6ab-4d2c-ac8b-86e70965fef8/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.059022 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b8b75602-eb3c-41f2-85d9-e5b055bd0724/init-config-reloader/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.218072 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b8b75602-eb3c-41f2-85d9-e5b055bd0724/init-config-reloader/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.219469 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b8b75602-eb3c-41f2-85d9-e5b055bd0724/prometheus/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.258906 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b8b75602-eb3c-41f2-85d9-e5b055bd0724/config-reloader/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.298340 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b8b75602-eb3c-41f2-85d9-e5b055bd0724/thanos-sidecar/0.log" Feb 17 
20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.407017 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b45c3cf8-448f-4ac0-8964-d78733369884/setup-container/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.674088 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b45c3cf8-448f-4ac0-8964-d78733369884/rabbitmq/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.705487 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b45c3cf8-448f-4ac0-8964-d78733369884/setup-container/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.742101 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ce51a677-1590-46d7-a8d9-47d1d2564a70/setup-container/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.895698 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ce51a677-1590-46d7-a8d9-47d1d2564a70/setup-container/0.log" Feb 17 20:26:50 crc kubenswrapper[4892]: I0217 20:26:50.947146 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-tb4ks_430ead3e-b98d-466b-819e-3231507f95cd/reboot-os-openstack-openstack-cell1/0.log" Feb 17 20:26:51 crc kubenswrapper[4892]: I0217 20:26:51.000625 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ce51a677-1590-46d7-a8d9-47d1d2564a70/rabbitmq/0.log" Feb 17 20:26:51 crc kubenswrapper[4892]: I0217 20:26:51.191487 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_51a16c44-9bb6-4250-8f1f-4617b6e20eb9/memcached/0.log" Feb 17 20:26:51 crc kubenswrapper[4892]: I0217 20:26:51.642887 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-qpx7f_ff891110-801e-4168-831b-d018dff2a1e5/ssh-known-hosts-openstack/0.log" Feb 17 
20:26:51 crc kubenswrapper[4892]: I0217 20:26:51.680365 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-ww4hh_ef5efbdd-488e-4956-b442-da3dfc5542e1/run-os-openstack-openstack-cell1/0.log" Feb 17 20:26:51 crc kubenswrapper[4892]: I0217 20:26:51.935596 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-pnvpv_164b2658-87b4-4639-98bf-9b594c5a8b00/telemetry-openstack-openstack-cell1/0.log" Feb 17 20:26:51 crc kubenswrapper[4892]: I0217 20:26:51.939702 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-w56qd_b17be7f8-f4d0-434f-b0b0-010faf440574/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 17 20:26:52 crc kubenswrapper[4892]: I0217 20:26:52.005769 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-v6tgq_b4cb43e5-177c-462c-acba-1a4ac62bf30d/validate-network-openstack-openstack-cell1/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.177004 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/util/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.392378 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/pull/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.396674 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/pull/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.417926 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/util/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.594191 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/extract/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.654639 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/pull/0.log" Feb 17 20:27:16 crc kubenswrapper[4892]: I0217 20:27:16.701417 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6058dfda14cb65c58e6e1b1bf4650548f92122c2af59420cb44b4d3438j7svn_4ebf28d4-a728-4295-a2c1-bbd21cd9c333/util/0.log" Feb 17 20:27:17 crc kubenswrapper[4892]: I0217 20:27:17.099209 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-5pdhr_cc4b7060-d89f-47c5-b2e4-2a793606350c/manager/0.log" Feb 17 20:27:17 crc kubenswrapper[4892]: I0217 20:27:17.556383 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-ml9zk_8dcfa260-ea93-42f8-a345-aec700f9e938/manager/0.log" Feb 17 20:27:17 crc kubenswrapper[4892]: I0217 20:27:17.689469 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-ngzhd_67ccc01c-23ce-407b-91dd-9554c49acbd5/manager/1.log" Feb 17 20:27:17 crc kubenswrapper[4892]: I0217 20:27:17.820961 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-ngzhd_67ccc01c-23ce-407b-91dd-9554c49acbd5/manager/0.log" Feb 17 20:27:18 crc kubenswrapper[4892]: I0217 
20:27:18.051583 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-nr7vj_48f194fc-64b3-4ef2-9006-8b533ce72000/manager/1.log" Feb 17 20:27:18 crc kubenswrapper[4892]: I0217 20:27:18.155496 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-nr7vj_48f194fc-64b3-4ef2-9006-8b533ce72000/manager/0.log" Feb 17 20:27:18 crc kubenswrapper[4892]: I0217 20:27:18.159990 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-p6wgw_ed7add95-6153-4284-b9be-a76a4142a35e/manager/0.log" Feb 17 20:27:18 crc kubenswrapper[4892]: I0217 20:27:18.568263 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-wdsjq_9693b58d-64b4-4d18-a746-ec0a67606de5/manager/0.log" Feb 17 20:27:19 crc kubenswrapper[4892]: I0217 20:27:19.009961 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-tw7fv_cdcfbb9d-667b-4333-bb72-96bbf99ed979/manager/0.log" Feb 17 20:27:19 crc kubenswrapper[4892]: I0217 20:27:19.159747 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-spdh6_471d6891-4b43-4dd0-86b1-5deb2fa418f7/manager/0.log" Feb 17 20:27:19 crc kubenswrapper[4892]: I0217 20:27:19.261013 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-ff5c8777-kccmj_0756a5c3-ad9d-4f9c-a3ce-77763bd1182e/manager/0.log" Feb 17 20:27:19 crc kubenswrapper[4892]: I0217 20:27:19.495512 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-67zr2_5716f235-cb99-4c41-b126-c122c572684a/manager/0.log" Feb 17 20:27:19 crc 
kubenswrapper[4892]: I0217 20:27:19.608065 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-m5npp_062a2519-0d5a-4662-8c84-5b8926ba32a2/manager/0.log" Feb 17 20:27:20 crc kubenswrapper[4892]: I0217 20:27:20.298237 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-7k7pc_ab01e10a-ce82-409d-8912-88ec85acac47/manager/0.log" Feb 17 20:27:20 crc kubenswrapper[4892]: I0217 20:27:20.530839 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84966cf5c4c5hvk_86e53c34-ab5b-49ab-a14e-13d76792b6ef/manager/0.log" Feb 17 20:27:20 crc kubenswrapper[4892]: I0217 20:27:20.597409 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-254f9_727885b3-46ff-43a8-9991-28567f43d07e/manager/0.log" Feb 17 20:27:21 crc kubenswrapper[4892]: I0217 20:27:21.015028 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k5wgc_72dd98da-284e-486e-8ec7-9418283ac655/registry-server/0.log" Feb 17 20:27:21 crc kubenswrapper[4892]: I0217 20:27:21.076516 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5498fb9db9-4w9jv_3916f167-fdb1-4c54-bc25-0a7fa20a6794/operator/0.log" Feb 17 20:27:21 crc kubenswrapper[4892]: I0217 20:27:21.357143 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-lf9pj_67b2947f-d96d-4697-9527-da5cbabc0552/manager/0.log" Feb 17 20:27:21 crc kubenswrapper[4892]: I0217 20:27:21.399122 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-ktptw_6622263b-b231-458c-b9fc-19061a5d73a7/manager/0.log" Feb 17 
20:27:22 crc kubenswrapper[4892]: I0217 20:27:22.111767 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jd4b7_5bc11699-06b5-4fd8-b8ff-6f3ecbda532f/operator/0.log" Feb 17 20:27:22 crc kubenswrapper[4892]: I0217 20:27:22.348236 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-9ws6d_689be620-172c-415d-927a-0a4f9ea9f5cb/manager/0.log" Feb 17 20:27:22 crc kubenswrapper[4892]: I0217 20:27:22.568507 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-cpl2x_f533d37d-5213-42af-82db-fc5e0208cb8d/manager/0.log" Feb 17 20:27:22 crc kubenswrapper[4892]: I0217 20:27:22.601365 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-tzf7x_2f32911d-efa2-45de-8a11-497adcb7d2bd/manager/0.log" Feb 17 20:27:22 crc kubenswrapper[4892]: I0217 20:27:22.762696 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-xftdg_a1264594-6ddc-417a-bea1-9bb9eedbb719/manager/0.log" Feb 17 20:27:24 crc kubenswrapper[4892]: I0217 20:27:24.156687 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-b96b9dfc9-cb6tc_1b49b91f-8f48-4831-ae32-4b1a9287123f/manager/0.log" Feb 17 20:27:24 crc kubenswrapper[4892]: I0217 20:27:24.375046 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-hjp6k_07a93e09-771e-4a85-89d0-e6fb19dcdb2d/manager/0.log" Feb 17 20:27:45 crc kubenswrapper[4892]: I0217 20:27:45.911449 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vhwpb_f6f8d659-22a7-478b-a52a-f1a82ee5a40a/control-plane-machine-set-operator/0.log" Feb 17 20:27:46 crc kubenswrapper[4892]: I0217 20:27:46.418807 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zh54l_be8497ae-6021-4f47-9bf1-64fc30d9e161/machine-api-operator/0.log" Feb 17 20:27:46 crc kubenswrapper[4892]: I0217 20:27:46.433476 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zh54l_be8497ae-6021-4f47-9bf1-64fc30d9e161/kube-rbac-proxy/0.log" Feb 17 20:28:01 crc kubenswrapper[4892]: I0217 20:28:01.940842 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-62mrg_c8cb92f6-00bd-44aa-bd4e-277d7327155d/cert-manager-controller/0.log" Feb 17 20:28:02 crc kubenswrapper[4892]: I0217 20:28:02.152423 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-8x4md_4bf0f495-906b-4e3c-b5e2-eecaaf07335e/cert-manager-cainjector/0.log" Feb 17 20:28:02 crc kubenswrapper[4892]: I0217 20:28:02.211993 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-t2r56_e7381d2c-f7e2-4935-be5f-380479a2e516/cert-manager-webhook/0.log" Feb 17 20:28:07 crc kubenswrapper[4892]: I0217 20:28:07.424368 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:28:07 crc kubenswrapper[4892]: I0217 20:28:07.424982 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:28:16 crc kubenswrapper[4892]: I0217 20:28:16.596010 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-cvjjp_642744dc-fe94-49d4-98a0-64115704ead8/nmstate-console-plugin/0.log" Feb 17 20:28:16 crc kubenswrapper[4892]: I0217 20:28:16.750176 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-99qbz_ffff86a7-6135-4bc5-b5ea-2b5f219551e4/nmstate-handler/0.log" Feb 17 20:28:16 crc kubenswrapper[4892]: I0217 20:28:16.815857 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-89vmm_91f1ad1b-9ce2-44b8-bdd0-6528b0563442/kube-rbac-proxy/0.log" Feb 17 20:28:16 crc kubenswrapper[4892]: I0217 20:28:16.837658 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-89vmm_91f1ad1b-9ce2-44b8-bdd0-6528b0563442/nmstate-metrics/0.log" Feb 17 20:28:17 crc kubenswrapper[4892]: I0217 20:28:17.022299 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-7nmvs_e291b988-bfc6-47d5-864a-71c877507a09/nmstate-operator/0.log" Feb 17 20:28:17 crc kubenswrapper[4892]: I0217 20:28:17.079842 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-v6h8d_0190d61f-6f4b-43a1-a439-acc019e8353a/nmstate-webhook/0.log" Feb 17 20:28:29 crc kubenswrapper[4892]: I0217 20:28:29.956327 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2tk5v_b593a6d0-56ba-4022-8084-246a3ac9fd30/prometheus-operator/0.log" Feb 17 20:28:30 crc kubenswrapper[4892]: I0217 20:28:30.171413 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f_25bc72d9-8845-4954-93f7-657b3cac94b6/prometheus-operator-admission-webhook/0.log" Feb 17 20:28:30 crc kubenswrapper[4892]: I0217 20:28:30.277019 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm_8a9c57e6-a854-4895-b520-267ac9379772/prometheus-operator-admission-webhook/0.log" Feb 17 20:28:30 crc kubenswrapper[4892]: I0217 20:28:30.371781 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hdpxb_6d3c1e97-ee06-41a8-890b-8606b2297aa0/operator/0.log" Feb 17 20:28:30 crc kubenswrapper[4892]: I0217 20:28:30.445104 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n5wfv_609d9353-9db2-4a1c-8f00-8cfe986c3b12/perses-operator/0.log" Feb 17 20:28:37 crc kubenswrapper[4892]: I0217 20:28:37.424758 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:28:37 crc kubenswrapper[4892]: I0217 20:28:37.425374 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:28:45 crc kubenswrapper[4892]: I0217 20:28:45.263568 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-n6bwg_7a4756aa-28e7-41ff-a32c-65b2a2db0b71/kube-rbac-proxy/0.log" Feb 17 20:28:45 crc kubenswrapper[4892]: I0217 20:28:45.448737 4892 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-t4nhs_dbdc8dbb-c968-46e3-8e11-9f1006a9ccbf/frr-k8s-webhook-server/0.log" Feb 17 20:28:45 crc kubenswrapper[4892]: I0217 20:28:45.673304 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-frr-files/0.log" Feb 17 20:28:45 crc kubenswrapper[4892]: I0217 20:28:45.677738 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-n6bwg_7a4756aa-28e7-41ff-a32c-65b2a2db0b71/controller/0.log" Feb 17 20:28:45 crc kubenswrapper[4892]: I0217 20:28:45.896169 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-frr-files/0.log" Feb 17 20:28:45 crc kubenswrapper[4892]: I0217 20:28:45.911138 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-reloader/0.log" Feb 17 20:28:46 crc kubenswrapper[4892]: I0217 20:28:46.740658 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-metrics/0.log" Feb 17 20:28:46 crc kubenswrapper[4892]: I0217 20:28:46.758149 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-reloader/0.log" Feb 17 20:28:46 crc kubenswrapper[4892]: I0217 20:28:46.930893 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-frr-files/0.log" Feb 17 20:28:46 crc kubenswrapper[4892]: I0217 20:28:46.937648 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-metrics/0.log" Feb 17 20:28:46 crc kubenswrapper[4892]: I0217 20:28:46.964278 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-metrics/0.log" Feb 17 20:28:46 crc kubenswrapper[4892]: I0217 20:28:46.992413 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-reloader/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.127615 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-reloader/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.179191 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-metrics/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.186475 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/cp-frr-files/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.220198 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/controller/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.341786 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/frr-metrics/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.387305 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/kube-rbac-proxy/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.442128 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/kube-rbac-proxy-frr/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.614901 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/reloader/0.log" Feb 17 20:28:47 crc kubenswrapper[4892]: I0217 20:28:47.703407 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cbb547f6d-mlwwq_97b49200-bd30-4c85-b61d-3b75276c00ef/manager/0.log" Feb 17 20:28:48 crc kubenswrapper[4892]: I0217 20:28:48.458335 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cbfbcbb66-8nlr9_b319b422-94c7-47c8-9e1f-2bef71d833b1/webhook-server/0.log" Feb 17 20:28:48 crc kubenswrapper[4892]: I0217 20:28:48.686613 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-55rj9_b6b73500-2314-4f41-83cf-3da8514065e3/kube-rbac-proxy/0.log" Feb 17 20:28:49 crc kubenswrapper[4892]: I0217 20:28:49.588008 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-55rj9_b6b73500-2314-4f41-83cf-3da8514065e3/speaker/0.log" Feb 17 20:28:50 crc kubenswrapper[4892]: I0217 20:28:50.857122 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xfjst_87e188e4-25c7-4068-8336-8396f99d9a51/frr/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.353287 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/util/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.602648 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/util/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.630718 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/pull/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.665334 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/pull/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.783066 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/util/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.820669 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/extract/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.877005 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e594m25_d5933e5f-3c83-41ab-b58a-9b5d5e7f7580/pull/0.log" Feb 17 20:29:03 crc kubenswrapper[4892]: I0217 20:29:03.960414 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/util/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.190963 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/util/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.192433 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/pull/0.log" Feb 17 
20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.210082 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/pull/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.374979 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/util/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.393441 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/pull/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.399280 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08k8zvg_a0b5e8ae-9589-439c-b68d-04963a2fc27b/extract/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.570164 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/util/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.744749 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/pull/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.770397 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/pull/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.776003 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/util/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.941314 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/util/0.log" Feb 17 20:29:04 crc kubenswrapper[4892]: I0217 20:29:04.942391 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/pull/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.025207 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138p9bq_28fded22-d298-44b3-8cb8-6588578ba409/extract/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.149927 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/extract-utilities/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.284174 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/extract-utilities/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.320756 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/extract-content/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.327995 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/extract-content/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.540291 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/extract-content/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.586726 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/extract-utilities/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.773124 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/extract-utilities/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.816466 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4f5j9_2f46f74c-ce94-4ff5-8543-6580352fda57/registry-server/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.949741 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/extract-utilities/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.965717 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/extract-content/0.log" Feb 17 20:29:05 crc kubenswrapper[4892]: I0217 20:29:05.973254 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/extract-content/0.log" Feb 17 20:29:06 crc kubenswrapper[4892]: I0217 20:29:06.201369 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/extract-utilities/0.log" Feb 17 20:29:06 crc kubenswrapper[4892]: I0217 20:29:06.238922 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/extract-content/0.log" Feb 17 20:29:06 crc kubenswrapper[4892]: I0217 20:29:06.655916 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/extract-utilities/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.424845 4892 patch_prober.go:28] interesting pod/machine-config-daemon-6mhzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.424889 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.424923 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.425634 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30"} pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.425681 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerName="machine-config-daemon" containerID="cri-o://2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" gracePeriod=600 Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.463360 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/extract-utilities/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.488495 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/extract-content/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.505429 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/extract-content/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: E0217 20:29:07.550740 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.613755 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pxrd_c5624079-5b90-4427-a09c-0d96ed3ddebd/registry-server/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.733797 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/extract-content/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.742678 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/extract-utilities/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.863043 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/extract-utilities/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.906627 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hghz_3b7d2423-4734-4106-91f3-fd4f6712a9d0/registry-server/0.log" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.976542 4892 generic.go:334] "Generic (PLEG): container finished" podID="f9013d62-9809-436b-82a8-5b18dbf13e35" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" exitCode=0 Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.976593 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerDied","Data":"2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30"} Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.976639 4892 scope.go:117] "RemoveContainer" containerID="d229474ca59a6d1790d5bfd995f8dcef27acdebd08978bf381531977775cb29f" Feb 17 20:29:07 crc kubenswrapper[4892]: I0217 20:29:07.977686 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:29:07 crc kubenswrapper[4892]: E0217 20:29:07.978421 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.080999 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/extract-content/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.096798 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/extract-content/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.096960 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/extract-utilities/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.263892 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/extract-utilities/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.310287 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/extract-content/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.313986 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/extract-utilities/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.508919 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/extract-utilities/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.546653 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/extract-content/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.603572 
4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/extract-content/0.log" Feb 17 20:29:08 crc kubenswrapper[4892]: I0217 20:29:08.822300 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8d66x_db2e4438-9aec-4791-88de-9b1d82af406d/registry-server/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.444718 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/extract-utilities/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.539435 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/extract-content/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.692967 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cmmtx_f052c8f0-9011-4e0e-af19-c27399a3dfc0/registry-server/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.701506 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/extract-utilities/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.820754 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/extract-content/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.843644 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/extract-utilities/0.log" Feb 17 20:29:09 crc kubenswrapper[4892]: I0217 20:29:09.873862 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/extract-content/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.056363 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/extract-utilities/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.056377 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/extract-content/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.116275 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/extract-utilities/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.205723 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhfx5_7700d4bf-d2a8-4c27-bba9-712bde89f76c/registry-server/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.341290 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/extract-utilities/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.353499 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/extract-content/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.364001 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/extract-content/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.543672 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/extract-content/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.549742 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/extract-utilities/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.638079 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/extract-utilities/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.754984 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fsc5r_28f724e3-47e3-47df-be60-cc0da5a15e25/registry-server/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.855242 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/extract-utilities/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.920051 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/extract-content/0.log" Feb 17 20:29:10 crc kubenswrapper[4892]: I0217 20:29:10.926799 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/extract-content/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.160080 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/extract-content/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.250994 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/extract-utilities/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.276620 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hwn4v_a7833e24-e8f2-422a-8759-be2e54c1f6ee/registry-server/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.297563 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/extract-utilities/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.442805 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/extract-content/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.472374 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/extract-utilities/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.473143 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/extract-content/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.620132 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/extract-utilities/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.721964 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/extract-utilities/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.735230 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/extract-content/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.862599 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jstz7_20048710-1af1-4532-8a8d-c8f325bcc2a1/registry-server/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.917960 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/extract-utilities/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.920147 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/extract-content/0.log" Feb 17 20:29:11 crc kubenswrapper[4892]: I0217 20:29:11.944862 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/extract-content/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.147313 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/extract-content/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.198465 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/extract-utilities/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.246776 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/extract-utilities/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.279460 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-n84ch_f87ba7cc-b3a3-4df2-bc4e-4f9d892f5c02/registry-server/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.415091 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/extract-utilities/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.460099 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/extract-content/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.469069 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/extract-content/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.645107 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/extract-utilities/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.650486 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/extract-content/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.759947 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/extract-utilities/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.800764 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xnsqq_b8e76551-8839-410b-9aec-a671ca2119dc/registry-server/0.log" Feb 17 20:29:12 crc kubenswrapper[4892]: I0217 20:29:12.961121 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/extract-content/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.006471 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/extract-content/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.048333 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/extract-utilities/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.261759 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/extract-utilities/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.341380 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/extract-content/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.375693 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/util/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.465598 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zcqks_a22a9e64-6027-45c1-be54-dc34baaf71c5/registry-server/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.647545 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/pull/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.673464 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/util/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.689388 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/pull/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.850846 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/pull/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.861280 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/util/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.866025 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaf5mfx_bfe219b7-e412-4fb9-8b96-36ed80ea8cd3/extract/0.log" Feb 17 20:29:13 crc kubenswrapper[4892]: I0217 20:29:13.975619 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5t5vc_c6d1cad4-ce5d-4116-a7ad-7a2f6c51c2bb/marketplace-operator/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.050219 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/extract-utilities/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.230123 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/extract-content/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.248307 4892 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/extract-content/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.266703 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/extract-utilities/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.455146 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/extract-utilities/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.465865 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/extract-content/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.524940 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/extract-utilities/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.705890 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/extract-utilities/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.767036 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/extract-content/0.log" Feb 17 20:29:14 crc kubenswrapper[4892]: I0217 20:29:14.792878 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/extract-content/0.log" Feb 17 20:29:15 crc kubenswrapper[4892]: I0217 20:29:15.015173 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/extract-utilities/0.log" Feb 17 20:29:15 crc kubenswrapper[4892]: I0217 20:29:15.042917 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8w24r_b11f3a8b-12fd-40bd-b67b-fa0ba188eed8/registry-server/0.log" Feb 17 20:29:15 crc kubenswrapper[4892]: I0217 20:29:15.059373 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/extract-content/0.log" Feb 17 20:29:15 crc kubenswrapper[4892]: I0217 20:29:15.451330 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f6mjl_2897de4e-559e-434d-8a35-944d76f621f2/registry-server/0.log" Feb 17 20:29:19 crc kubenswrapper[4892]: I0217 20:29:19.376974 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:29:19 crc kubenswrapper[4892]: E0217 20:29:19.387624 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:29:28 crc kubenswrapper[4892]: I0217 20:29:28.972979 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2tk5v_b593a6d0-56ba-4022-8084-246a3ac9fd30/prometheus-operator/0.log" Feb 17 20:29:29 crc kubenswrapper[4892]: I0217 20:29:29.024889 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7567bbd4cd-mqtrm_8a9c57e6-a854-4895-b520-267ac9379772/prometheus-operator-admission-webhook/0.log" Feb 17 20:29:29 crc kubenswrapper[4892]: I0217 20:29:29.030767 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7567bbd4cd-dvr5f_25bc72d9-8845-4954-93f7-657b3cac94b6/prometheus-operator-admission-webhook/0.log" Feb 17 20:29:29 crc kubenswrapper[4892]: I0217 20:29:29.186871 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hdpxb_6d3c1e97-ee06-41a8-890b-8606b2297aa0/operator/0.log" Feb 17 20:29:29 crc kubenswrapper[4892]: I0217 20:29:29.209314 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n5wfv_609d9353-9db2-4a1c-8f00-8cfe986c3b12/perses-operator/0.log" Feb 17 20:29:31 crc kubenswrapper[4892]: I0217 20:29:31.359395 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:29:31 crc kubenswrapper[4892]: E0217 20:29:31.361130 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:29:45 crc kubenswrapper[4892]: I0217 20:29:45.360708 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:29:45 crc kubenswrapper[4892]: E0217 20:29:45.361570 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:29:56 crc kubenswrapper[4892]: I0217 20:29:56.359651 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:29:56 crc kubenswrapper[4892]: E0217 20:29:56.360324 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.156237 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c"] Feb 17 20:30:00 crc kubenswrapper[4892]: E0217 20:30:00.158785 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="extract-content" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.158829 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="extract-content" Feb 17 20:30:00 crc kubenswrapper[4892]: E0217 20:30:00.158866 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="registry-server" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.158874 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="registry-server" Feb 17 20:30:00 crc 
kubenswrapper[4892]: E0217 20:30:00.158902 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="extract-utilities" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.158912 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="extract-utilities" Feb 17 20:30:00 crc kubenswrapper[4892]: E0217 20:30:00.158928 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="extract-content" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.158933 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="extract-content" Feb 17 20:30:00 crc kubenswrapper[4892]: E0217 20:30:00.158958 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="extract-utilities" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.158966 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="extract-utilities" Feb 17 20:30:00 crc kubenswrapper[4892]: E0217 20:30:00.159008 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="registry-server" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.159015 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="registry-server" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.161139 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7d4b72-4b0e-4eba-9692-b58a2c066a5a" containerName="registry-server" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.161197 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f96b1606-24c5-4f44-8edd-eae1338b4ae8" containerName="registry-server" Feb 17 20:30:00 
crc kubenswrapper[4892]: I0217 20:30:00.163026 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.181010 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.181096 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.194715 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c"] Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.330833 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-config-volume\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.331046 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-secret-volume\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.331087 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8r2l\" (UniqueName: \"kubernetes.io/projected/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-kube-api-access-v8r2l\") pod \"collect-profiles-29522670-gdh6c\" 
(UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.433161 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-secret-volume\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.433559 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8r2l\" (UniqueName: \"kubernetes.io/projected/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-kube-api-access-v8r2l\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.433697 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-config-volume\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.435283 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-config-volume\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.462700 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-secret-volume\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.476651 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8r2l\" (UniqueName: \"kubernetes.io/projected/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-kube-api-access-v8r2l\") pod \"collect-profiles-29522670-gdh6c\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:00 crc kubenswrapper[4892]: I0217 20:30:00.510685 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:01 crc kubenswrapper[4892]: I0217 20:30:01.082558 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c"] Feb 17 20:30:01 crc kubenswrapper[4892]: I0217 20:30:01.538470 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" event={"ID":"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7","Type":"ContainerStarted","Data":"f335b2618ee7adfd97edc00a257b96a91cdb2aa68a945c0cb9e39893f1b42931"} Feb 17 20:30:01 crc kubenswrapper[4892]: I0217 20:30:01.538914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" event={"ID":"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7","Type":"ContainerStarted","Data":"3b5c8348899761211228bc4fc90d4dd600ee9d13fd4a9f8762fd86829eb4e88c"} Feb 17 20:30:01 crc kubenswrapper[4892]: I0217 20:30:01.570180 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" 
podStartSLOduration=1.570159259 podStartE2EDuration="1.570159259s" podCreationTimestamp="2026-02-17 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 20:30:01.559422159 +0000 UTC m=+9972.934825424" watchObservedRunningTime="2026-02-17 20:30:01.570159259 +0000 UTC m=+9972.945562524" Feb 17 20:30:02 crc kubenswrapper[4892]: I0217 20:30:02.556357 4892 generic.go:334] "Generic (PLEG): container finished" podID="cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" containerID="f335b2618ee7adfd97edc00a257b96a91cdb2aa68a945c0cb9e39893f1b42931" exitCode=0 Feb 17 20:30:02 crc kubenswrapper[4892]: I0217 20:30:02.556548 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" event={"ID":"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7","Type":"ContainerDied","Data":"f335b2618ee7adfd97edc00a257b96a91cdb2aa68a945c0cb9e39893f1b42931"} Feb 17 20:30:03 crc kubenswrapper[4892]: I0217 20:30:03.992040 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.132687 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-secret-volume\") pod \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.132726 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-config-volume\") pod \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.132939 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8r2l\" (UniqueName: \"kubernetes.io/projected/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-kube-api-access-v8r2l\") pod \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\" (UID: \"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7\") " Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.133652 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-config-volume" (OuterVolumeSpecName: "config-volume") pod "cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" (UID: "cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.142084 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" (UID: "cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.142179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-kube-api-access-v8r2l" (OuterVolumeSpecName: "kube-api-access-v8r2l") pod "cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" (UID: "cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7"). InnerVolumeSpecName "kube-api-access-v8r2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.236364 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8r2l\" (UniqueName: \"kubernetes.io/projected/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-kube-api-access-v8r2l\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.236635 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.236738 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.578466 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" event={"ID":"cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7","Type":"ContainerDied","Data":"3b5c8348899761211228bc4fc90d4dd600ee9d13fd4a9f8762fd86829eb4e88c"} Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.578515 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5c8348899761211228bc4fc90d4dd600ee9d13fd4a9f8762fd86829eb4e88c" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.578522 4892 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522670-gdh6c" Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.644921 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs"] Feb 17 20:30:04 crc kubenswrapper[4892]: I0217 20:30:04.657471 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522625-4h5xs"] Feb 17 20:30:05 crc kubenswrapper[4892]: I0217 20:30:05.375654 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc6e067c-cdf0-490c-a561-43a216cde39c" path="/var/lib/kubelet/pods/dc6e067c-cdf0-490c-a561-43a216cde39c/volumes" Feb 17 20:30:07 crc kubenswrapper[4892]: I0217 20:30:07.359700 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:30:07 crc kubenswrapper[4892]: E0217 20:30:07.360240 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.364836 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gst5s"] Feb 17 20:30:10 crc kubenswrapper[4892]: E0217 20:30:10.365754 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" containerName="collect-profiles" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.365766 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" containerName="collect-profiles" Feb 17 
20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.366039 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad8aaf6-f9dd-40d1-82a1-c2d45e338ce7" containerName="collect-profiles" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.368078 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.409916 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gst5s"] Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.495203 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-utilities\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.495334 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-catalog-content\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.495430 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmz7\" (UniqueName: \"kubernetes.io/projected/50b4d84f-1f61-45e3-a80b-14f89f7d9028-kube-api-access-mxmz7\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.597244 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-catalog-content\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.597353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmz7\" (UniqueName: \"kubernetes.io/projected/50b4d84f-1f61-45e3-a80b-14f89f7d9028-kube-api-access-mxmz7\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.597430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-utilities\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.597934 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-utilities\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.598099 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-catalog-content\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.627981 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmz7\" (UniqueName: 
\"kubernetes.io/projected/50b4d84f-1f61-45e3-a80b-14f89f7d9028-kube-api-access-mxmz7\") pod \"redhat-marketplace-gst5s\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:10 crc kubenswrapper[4892]: I0217 20:30:10.706525 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:11 crc kubenswrapper[4892]: I0217 20:30:11.093474 4892 scope.go:117] "RemoveContainer" containerID="76458a9fb6d6c16b17614531b215790665eaa9219e3ed7b1e86569c79ad49f77" Feb 17 20:30:11 crc kubenswrapper[4892]: I0217 20:30:11.240352 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gst5s"] Feb 17 20:30:11 crc kubenswrapper[4892]: I0217 20:30:11.684790 4892 generic.go:334] "Generic (PLEG): container finished" podID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerID="51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7" exitCode=0 Feb 17 20:30:11 crc kubenswrapper[4892]: I0217 20:30:11.685128 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gst5s" event={"ID":"50b4d84f-1f61-45e3-a80b-14f89f7d9028","Type":"ContainerDied","Data":"51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7"} Feb 17 20:30:11 crc kubenswrapper[4892]: I0217 20:30:11.685163 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gst5s" event={"ID":"50b4d84f-1f61-45e3-a80b-14f89f7d9028","Type":"ContainerStarted","Data":"adeff93f70bca2d71b80cd58c8f255f877efa24b7ee38dd9e941abf4221c1a26"} Feb 17 20:30:11 crc kubenswrapper[4892]: I0217 20:30:11.687628 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 20:30:13 crc kubenswrapper[4892]: I0217 20:30:13.742113 4892 generic.go:334] "Generic (PLEG): container finished" podID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" 
containerID="3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7" exitCode=0 Feb 17 20:30:13 crc kubenswrapper[4892]: I0217 20:30:13.742214 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gst5s" event={"ID":"50b4d84f-1f61-45e3-a80b-14f89f7d9028","Type":"ContainerDied","Data":"3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7"} Feb 17 20:30:14 crc kubenswrapper[4892]: I0217 20:30:14.755365 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gst5s" event={"ID":"50b4d84f-1f61-45e3-a80b-14f89f7d9028","Type":"ContainerStarted","Data":"444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b"} Feb 17 20:30:14 crc kubenswrapper[4892]: I0217 20:30:14.780058 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gst5s" podStartSLOduration=2.088614875 podStartE2EDuration="4.780040106s" podCreationTimestamp="2026-02-17 20:30:10 +0000 UTC" firstStartedPulling="2026-02-17 20:30:11.687313558 +0000 UTC m=+9983.062716823" lastFinishedPulling="2026-02-17 20:30:14.378738779 +0000 UTC m=+9985.754142054" observedRunningTime="2026-02-17 20:30:14.779586944 +0000 UTC m=+9986.154990229" watchObservedRunningTime="2026-02-17 20:30:14.780040106 +0000 UTC m=+9986.155443371" Feb 17 20:30:19 crc kubenswrapper[4892]: I0217 20:30:19.374619 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:30:19 crc kubenswrapper[4892]: E0217 20:30:19.376101 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:30:20 crc kubenswrapper[4892]: I0217 20:30:20.707685 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:20 crc kubenswrapper[4892]: I0217 20:30:20.707775 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:20 crc kubenswrapper[4892]: I0217 20:30:20.777657 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:20 crc kubenswrapper[4892]: I0217 20:30:20.910077 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:21 crc kubenswrapper[4892]: I0217 20:30:21.023971 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gst5s"] Feb 17 20:30:22 crc kubenswrapper[4892]: I0217 20:30:22.875873 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gst5s" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="registry-server" containerID="cri-o://444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b" gracePeriod=2 Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.449002 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.536629 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-utilities\") pod \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.536786 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxmz7\" (UniqueName: \"kubernetes.io/projected/50b4d84f-1f61-45e3-a80b-14f89f7d9028-kube-api-access-mxmz7\") pod \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.536948 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-catalog-content\") pod \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\" (UID: \"50b4d84f-1f61-45e3-a80b-14f89f7d9028\") " Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.554709 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-utilities" (OuterVolumeSpecName: "utilities") pod "50b4d84f-1f61-45e3-a80b-14f89f7d9028" (UID: "50b4d84f-1f61-45e3-a80b-14f89f7d9028"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.555937 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b4d84f-1f61-45e3-a80b-14f89f7d9028-kube-api-access-mxmz7" (OuterVolumeSpecName: "kube-api-access-mxmz7") pod "50b4d84f-1f61-45e3-a80b-14f89f7d9028" (UID: "50b4d84f-1f61-45e3-a80b-14f89f7d9028"). InnerVolumeSpecName "kube-api-access-mxmz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.565107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50b4d84f-1f61-45e3-a80b-14f89f7d9028" (UID: "50b4d84f-1f61-45e3-a80b-14f89f7d9028"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.640038 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.640069 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b4d84f-1f61-45e3-a80b-14f89f7d9028-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.640079 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxmz7\" (UniqueName: \"kubernetes.io/projected/50b4d84f-1f61-45e3-a80b-14f89f7d9028-kube-api-access-mxmz7\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.892015 4892 generic.go:334] "Generic (PLEG): container finished" podID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerID="444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b" exitCode=0 Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.892056 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gst5s" event={"ID":"50b4d84f-1f61-45e3-a80b-14f89f7d9028","Type":"ContainerDied","Data":"444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b"} Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.892088 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gst5s" event={"ID":"50b4d84f-1f61-45e3-a80b-14f89f7d9028","Type":"ContainerDied","Data":"adeff93f70bca2d71b80cd58c8f255f877efa24b7ee38dd9e941abf4221c1a26"} Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.892108 4892 scope.go:117] "RemoveContainer" containerID="444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.892127 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gst5s" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.936138 4892 scope.go:117] "RemoveContainer" containerID="3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7" Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.941578 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gst5s"] Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.950557 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gst5s"] Feb 17 20:30:23 crc kubenswrapper[4892]: I0217 20:30:23.965489 4892 scope.go:117] "RemoveContainer" containerID="51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7" Feb 17 20:30:24 crc kubenswrapper[4892]: I0217 20:30:24.030466 4892 scope.go:117] "RemoveContainer" containerID="444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b" Feb 17 20:30:24 crc kubenswrapper[4892]: E0217 20:30:24.031742 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b\": container with ID starting with 444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b not found: ID does not exist" containerID="444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b" Feb 17 20:30:24 crc kubenswrapper[4892]: I0217 20:30:24.031781 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b"} err="failed to get container status \"444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b\": rpc error: code = NotFound desc = could not find container \"444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b\": container with ID starting with 444a1a342e25ad031862b0c85b3200651818da5d8e9d1a2051cbe92a0efe9e6b not found: ID does not exist" Feb 17 20:30:24 crc kubenswrapper[4892]: I0217 20:30:24.031805 4892 scope.go:117] "RemoveContainer" containerID="3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7" Feb 17 20:30:24 crc kubenswrapper[4892]: E0217 20:30:24.032133 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7\": container with ID starting with 3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7 not found: ID does not exist" containerID="3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7" Feb 17 20:30:24 crc kubenswrapper[4892]: I0217 20:30:24.032162 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7"} err="failed to get container status \"3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7\": rpc error: code = NotFound desc = could not find container \"3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7\": container with ID starting with 3c1daaf1912b1c6e7a78b8a9190bc17c521bcb38745bc877dfca60b552cd52d7 not found: ID does not exist" Feb 17 20:30:24 crc kubenswrapper[4892]: I0217 20:30:24.032179 4892 scope.go:117] "RemoveContainer" containerID="51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7" Feb 17 20:30:24 crc kubenswrapper[4892]: E0217 
20:30:24.032399 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7\": container with ID starting with 51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7 not found: ID does not exist" containerID="51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7" Feb 17 20:30:24 crc kubenswrapper[4892]: I0217 20:30:24.032423 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7"} err="failed to get container status \"51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7\": rpc error: code = NotFound desc = could not find container \"51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7\": container with ID starting with 51c8a4591e4a46c4ea45dad9df13ee12e183faa653df29d3d4d96e7af0d620a7 not found: ID does not exist" Feb 17 20:30:25 crc kubenswrapper[4892]: I0217 20:30:25.376366 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" path="/var/lib/kubelet/pods/50b4d84f-1f61-45e3-a80b-14f89f7d9028/volumes" Feb 17 20:30:34 crc kubenswrapper[4892]: I0217 20:30:34.359868 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:30:34 crc kubenswrapper[4892]: E0217 20:30:34.360811 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.708940 
4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hwr79"] Feb 17 20:30:37 crc kubenswrapper[4892]: E0217 20:30:37.710943 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="registry-server" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.710973 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="registry-server" Feb 17 20:30:37 crc kubenswrapper[4892]: E0217 20:30:37.710988 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="extract-content" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.710996 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="extract-content" Feb 17 20:30:37 crc kubenswrapper[4892]: E0217 20:30:37.711017 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="extract-utilities" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.711026 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="extract-utilities" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.711273 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b4d84f-1f61-45e3-a80b-14f89f7d9028" containerName="registry-server" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.713398 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.726285 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwr79"] Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.912551 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rgq\" (UniqueName: \"kubernetes.io/projected/21564135-8634-4bbb-bd75-ca5202dffd1c-kube-api-access-b7rgq\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.912942 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-catalog-content\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:37 crc kubenswrapper[4892]: I0217 20:30:37.913086 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-utilities\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.016215 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-utilities\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.016417 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b7rgq\" (UniqueName: \"kubernetes.io/projected/21564135-8634-4bbb-bd75-ca5202dffd1c-kube-api-access-b7rgq\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.016459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-catalog-content\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.017103 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-utilities\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.017126 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-catalog-content\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.053426 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rgq\" (UniqueName: \"kubernetes.io/projected/21564135-8634-4bbb-bd75-ca5202dffd1c-kube-api-access-b7rgq\") pod \"certified-operators-hwr79\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.337700 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:38 crc kubenswrapper[4892]: I0217 20:30:38.854700 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwr79"] Feb 17 20:30:39 crc kubenswrapper[4892]: I0217 20:30:39.075422 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerStarted","Data":"5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f"} Feb 17 20:30:39 crc kubenswrapper[4892]: I0217 20:30:39.075527 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerStarted","Data":"d6b6fdfca4c22729e8ecf8f505bac9a2667810674537a978ccbee2cf60d61b39"} Feb 17 20:30:40 crc kubenswrapper[4892]: I0217 20:30:40.088936 4892 generic.go:334] "Generic (PLEG): container finished" podID="21564135-8634-4bbb-bd75-ca5202dffd1c" containerID="5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f" exitCode=0 Feb 17 20:30:40 crc kubenswrapper[4892]: I0217 20:30:40.089162 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerDied","Data":"5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f"} Feb 17 20:30:41 crc kubenswrapper[4892]: I0217 20:30:41.107711 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerStarted","Data":"2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502"} Feb 17 20:30:42 crc kubenswrapper[4892]: I0217 20:30:42.120398 4892 generic.go:334] "Generic (PLEG): container finished" podID="21564135-8634-4bbb-bd75-ca5202dffd1c" 
containerID="2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502" exitCode=0 Feb 17 20:30:42 crc kubenswrapper[4892]: I0217 20:30:42.120492 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerDied","Data":"2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502"} Feb 17 20:30:43 crc kubenswrapper[4892]: I0217 20:30:43.141505 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerStarted","Data":"a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805"} Feb 17 20:30:43 crc kubenswrapper[4892]: I0217 20:30:43.167099 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hwr79" podStartSLOduration=3.767294931 podStartE2EDuration="6.167073129s" podCreationTimestamp="2026-02-17 20:30:37 +0000 UTC" firstStartedPulling="2026-02-17 20:30:40.091606467 +0000 UTC m=+10011.467009742" lastFinishedPulling="2026-02-17 20:30:42.491384675 +0000 UTC m=+10013.866787940" observedRunningTime="2026-02-17 20:30:43.161229731 +0000 UTC m=+10014.536633016" watchObservedRunningTime="2026-02-17 20:30:43.167073129 +0000 UTC m=+10014.542476404" Feb 17 20:30:45 crc kubenswrapper[4892]: I0217 20:30:45.366692 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:30:45 crc kubenswrapper[4892]: E0217 20:30:45.367648 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" 
podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:30:48 crc kubenswrapper[4892]: I0217 20:30:48.338747 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:48 crc kubenswrapper[4892]: I0217 20:30:48.339061 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:48 crc kubenswrapper[4892]: I0217 20:30:48.427476 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:49 crc kubenswrapper[4892]: I0217 20:30:49.272658 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:49 crc kubenswrapper[4892]: I0217 20:30:49.336869 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwr79"] Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.229393 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hwr79" podUID="21564135-8634-4bbb-bd75-ca5202dffd1c" containerName="registry-server" containerID="cri-o://a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805" gracePeriod=2 Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.787799 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.942046 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7rgq\" (UniqueName: \"kubernetes.io/projected/21564135-8634-4bbb-bd75-ca5202dffd1c-kube-api-access-b7rgq\") pod \"21564135-8634-4bbb-bd75-ca5202dffd1c\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.942271 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-utilities\") pod \"21564135-8634-4bbb-bd75-ca5202dffd1c\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.942347 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-catalog-content\") pod \"21564135-8634-4bbb-bd75-ca5202dffd1c\" (UID: \"21564135-8634-4bbb-bd75-ca5202dffd1c\") " Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.943059 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-utilities" (OuterVolumeSpecName: "utilities") pod "21564135-8634-4bbb-bd75-ca5202dffd1c" (UID: "21564135-8634-4bbb-bd75-ca5202dffd1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:30:51 crc kubenswrapper[4892]: I0217 20:30:51.947382 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21564135-8634-4bbb-bd75-ca5202dffd1c-kube-api-access-b7rgq" (OuterVolumeSpecName: "kube-api-access-b7rgq") pod "21564135-8634-4bbb-bd75-ca5202dffd1c" (UID: "21564135-8634-4bbb-bd75-ca5202dffd1c"). InnerVolumeSpecName "kube-api-access-b7rgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.046393 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7rgq\" (UniqueName: \"kubernetes.io/projected/21564135-8634-4bbb-bd75-ca5202dffd1c-kube-api-access-b7rgq\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.046451 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.121015 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21564135-8634-4bbb-bd75-ca5202dffd1c" (UID: "21564135-8634-4bbb-bd75-ca5202dffd1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.149111 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21564135-8634-4bbb-bd75-ca5202dffd1c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.252097 4892 generic.go:334] "Generic (PLEG): container finished" podID="21564135-8634-4bbb-bd75-ca5202dffd1c" containerID="a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805" exitCode=0 Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.252158 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerDied","Data":"a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805"} Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.252203 4892 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-hwr79" event={"ID":"21564135-8634-4bbb-bd75-ca5202dffd1c","Type":"ContainerDied","Data":"d6b6fdfca4c22729e8ecf8f505bac9a2667810674537a978ccbee2cf60d61b39"} Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.252232 4892 scope.go:117] "RemoveContainer" containerID="a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.252450 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwr79" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.306326 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwr79"] Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.326787 4892 scope.go:117] "RemoveContainer" containerID="2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502" Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.330370 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hwr79"] Feb 17 20:30:52 crc kubenswrapper[4892]: I0217 20:30:52.361931 4892 scope.go:117] "RemoveContainer" containerID="5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f" Feb 17 20:30:52 crc kubenswrapper[4892]: E0217 20:30:52.536980 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21564135_8634_4bbb_bd75_ca5202dffd1c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21564135_8634_4bbb_bd75_ca5202dffd1c.slice/crio-d6b6fdfca4c22729e8ecf8f505bac9a2667810674537a978ccbee2cf60d61b39\": RecentStats: unable to find data in memory cache]" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.050561 4892 scope.go:117] "RemoveContainer" 
containerID="a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805" Feb 17 20:30:53 crc kubenswrapper[4892]: E0217 20:30:53.053694 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805\": container with ID starting with a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805 not found: ID does not exist" containerID="a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.053747 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805"} err="failed to get container status \"a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805\": rpc error: code = NotFound desc = could not find container \"a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805\": container with ID starting with a33f67a32c09842dfe6d0a92a48e02a7b786a166d9053c87289716b83c00a805 not found: ID does not exist" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.053776 4892 scope.go:117] "RemoveContainer" containerID="2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502" Feb 17 20:30:53 crc kubenswrapper[4892]: E0217 20:30:53.055829 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502\": container with ID starting with 2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502 not found: ID does not exist" containerID="2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.055863 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502"} err="failed to get container status \"2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502\": rpc error: code = NotFound desc = could not find container \"2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502\": container with ID starting with 2642cb4a3a374a721918100be792af9ea29dc20a8b7389bb89658453ecabe502 not found: ID does not exist" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.055882 4892 scope.go:117] "RemoveContainer" containerID="5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f" Feb 17 20:30:53 crc kubenswrapper[4892]: E0217 20:30:53.058001 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f\": container with ID starting with 5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f not found: ID does not exist" containerID="5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.058048 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f"} err="failed to get container status \"5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f\": rpc error: code = NotFound desc = could not find container \"5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f\": container with ID starting with 5e38319376829c0f23d40105262e34657fb4073ce532d97cfd21227761ddc85f not found: ID does not exist" Feb 17 20:30:53 crc kubenswrapper[4892]: I0217 20:30:53.389958 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21564135-8634-4bbb-bd75-ca5202dffd1c" path="/var/lib/kubelet/pods/21564135-8634-4bbb-bd75-ca5202dffd1c/volumes" Feb 17 20:31:00 crc kubenswrapper[4892]: I0217 
20:31:00.359838 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:31:00 crc kubenswrapper[4892]: E0217 20:31:00.360757 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:31:12 crc kubenswrapper[4892]: I0217 20:31:12.360963 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:31:12 crc kubenswrapper[4892]: E0217 20:31:12.362003 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:31:24 crc kubenswrapper[4892]: I0217 20:31:24.359716 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:31:24 crc kubenswrapper[4892]: E0217 20:31:24.360881 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:31:36 crc 
kubenswrapper[4892]: I0217 20:31:36.360591 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:31:36 crc kubenswrapper[4892]: E0217 20:31:36.362168 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:31:51 crc kubenswrapper[4892]: I0217 20:31:51.010365 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1c2ecb1-1a9c-4fc7-953f-710e44082d31" containerID="1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712" exitCode=0 Feb 17 20:31:51 crc kubenswrapper[4892]: I0217 20:31:51.010468 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjnt/must-gather-7pdtl" event={"ID":"a1c2ecb1-1a9c-4fc7-953f-710e44082d31","Type":"ContainerDied","Data":"1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712"} Feb 17 20:31:51 crc kubenswrapper[4892]: I0217 20:31:51.011462 4892 scope.go:117] "RemoveContainer" containerID="1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712" Feb 17 20:31:51 crc kubenswrapper[4892]: I0217 20:31:51.360170 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:31:51 crc kubenswrapper[4892]: E0217 20:31:51.360531 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:31:52 crc kubenswrapper[4892]: I0217 20:31:52.034644 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znjnt_must-gather-7pdtl_a1c2ecb1-1a9c-4fc7-953f-710e44082d31/gather/0.log" Feb 17 20:32:00 crc kubenswrapper[4892]: I0217 20:32:00.611028 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znjnt/must-gather-7pdtl"] Feb 17 20:32:00 crc kubenswrapper[4892]: I0217 20:32:00.611759 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-znjnt/must-gather-7pdtl" podUID="a1c2ecb1-1a9c-4fc7-953f-710e44082d31" containerName="copy" containerID="cri-o://3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d" gracePeriod=2 Feb 17 20:32:00 crc kubenswrapper[4892]: I0217 20:32:00.628603 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znjnt/must-gather-7pdtl"] Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.066138 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znjnt_must-gather-7pdtl_a1c2ecb1-1a9c-4fc7-953f-710e44082d31/copy/0.log" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.067335 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.110226 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znjnt_must-gather-7pdtl_a1c2ecb1-1a9c-4fc7-953f-710e44082d31/copy/0.log" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.110657 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1c2ecb1-1a9c-4fc7-953f-710e44082d31" containerID="3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d" exitCode=143 Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.110720 4892 scope.go:117] "RemoveContainer" containerID="3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.110878 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjnt/must-gather-7pdtl" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.133905 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-must-gather-output\") pod \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.134214 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5hmg\" (UniqueName: \"kubernetes.io/projected/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-kube-api-access-g5hmg\") pod \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\" (UID: \"a1c2ecb1-1a9c-4fc7-953f-710e44082d31\") " Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.151324 4892 scope.go:117] "RemoveContainer" containerID="1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.159927 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-kube-api-access-g5hmg" (OuterVolumeSpecName: "kube-api-access-g5hmg") pod "a1c2ecb1-1a9c-4fc7-953f-710e44082d31" (UID: "a1c2ecb1-1a9c-4fc7-953f-710e44082d31"). InnerVolumeSpecName "kube-api-access-g5hmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.238024 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5hmg\" (UniqueName: \"kubernetes.io/projected/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-kube-api-access-g5hmg\") on node \"crc\" DevicePath \"\"" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.248610 4892 scope.go:117] "RemoveContainer" containerID="3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d" Feb 17 20:32:01 crc kubenswrapper[4892]: E0217 20:32:01.249011 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d\": container with ID starting with 3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d not found: ID does not exist" containerID="3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.249055 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d"} err="failed to get container status \"3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d\": rpc error: code = NotFound desc = could not find container \"3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d\": container with ID starting with 3b35208fdca84d382a7f2a1181e8ac6dabc39402f77e768f8e758538e98ad74d not found: ID does not exist" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.249082 4892 scope.go:117] "RemoveContainer" 
containerID="1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712" Feb 17 20:32:01 crc kubenswrapper[4892]: E0217 20:32:01.249379 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712\": container with ID starting with 1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712 not found: ID does not exist" containerID="1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.249401 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712"} err="failed to get container status \"1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712\": rpc error: code = NotFound desc = could not find container \"1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712\": container with ID starting with 1c27df014d87bb7866c77dd33c4108654acb697483efd58f775ebe04fdd1b712 not found: ID does not exist" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.351974 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a1c2ecb1-1a9c-4fc7-953f-710e44082d31" (UID: "a1c2ecb1-1a9c-4fc7-953f-710e44082d31"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.375064 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c2ecb1-1a9c-4fc7-953f-710e44082d31" path="/var/lib/kubelet/pods/a1c2ecb1-1a9c-4fc7-953f-710e44082d31/volumes" Feb 17 20:32:01 crc kubenswrapper[4892]: I0217 20:32:01.442196 4892 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a1c2ecb1-1a9c-4fc7-953f-710e44082d31-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 20:32:03 crc kubenswrapper[4892]: I0217 20:32:03.360268 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:32:03 crc kubenswrapper[4892]: E0217 20:32:03.361315 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:32:17 crc kubenswrapper[4892]: I0217 20:32:17.360355 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:32:17 crc kubenswrapper[4892]: E0217 20:32:17.361152 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:32:28 crc kubenswrapper[4892]: I0217 20:32:28.360095 4892 
scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:32:28 crc kubenswrapper[4892]: E0217 20:32:28.361195 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:32:42 crc kubenswrapper[4892]: I0217 20:32:42.360399 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:32:42 crc kubenswrapper[4892]: E0217 20:32:42.361199 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:32:54 crc kubenswrapper[4892]: I0217 20:32:54.361131 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:32:54 crc kubenswrapper[4892]: E0217 20:32:54.362261 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:33:06 crc kubenswrapper[4892]: I0217 
20:33:06.360836 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:33:06 crc kubenswrapper[4892]: E0217 20:33:06.361646 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:33:21 crc kubenswrapper[4892]: I0217 20:33:21.360174 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:33:21 crc kubenswrapper[4892]: E0217 20:33:21.361744 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:33:36 crc kubenswrapper[4892]: I0217 20:33:36.361278 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:33:36 crc kubenswrapper[4892]: E0217 20:33:36.362223 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:33:47 crc 
kubenswrapper[4892]: I0217 20:33:47.359542 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:33:47 crc kubenswrapper[4892]: E0217 20:33:47.360402 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:33:58 crc kubenswrapper[4892]: I0217 20:33:58.550647 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:33:58 crc kubenswrapper[4892]: E0217 20:33:58.551497 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6mhzt_openshift-machine-config-operator(f9013d62-9809-436b-82a8-5b18dbf13e35)\"" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" podUID="f9013d62-9809-436b-82a8-5b18dbf13e35" Feb 17 20:34:09 crc kubenswrapper[4892]: I0217 20:34:09.374621 4892 scope.go:117] "RemoveContainer" containerID="2d8fac9640248d569699481a65fe24982eb12cc3dc607889c6c3f752cff2bc30" Feb 17 20:34:09 crc kubenswrapper[4892]: I0217 20:34:09.751388 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6mhzt" event={"ID":"f9013d62-9809-436b-82a8-5b18dbf13e35","Type":"ContainerStarted","Data":"4bf22c28ff41924fa6f75c7dedd069f2bd93cde9cad357dd53c6f78056175065"}